Tools

Tools are LLM-callable functions assembled at startup from config.toml. Each row below lists the public tool name exposed to the model, how it is enabled, and the intended use.

Tool Surface

| Group | Tool | Availability | Purpose |
| --- | --- | --- | --- |
| Chat history | chat_history_info | Always available | Inspect the current conversation history size. |
| Chat history | chat_history_trim | Always available | Remove old history entries for the current conversation. |
| Memory | memory | [tools.kv_memory] | Save, retrieve, search, list, and delete persistent user notes. |
| Utility | current_datetime | [tools.time]; enabled by default | Return the current UTC datetime using an optional strftime format. |
| Utility | wait | [tools.wait] | Pause execution for a given number of milliseconds (clamped to max_milliseconds). |
| Utility | calculate_expression | [tools.calculator]; enabled by default | Evaluate bounded arithmetic with Decimal precision. |
| HTTP | http_request | [tools.http_client] | Fetch HTTP/HTTPS resources with size limits and optional managed-file spillover. |
| Host execution | python_execute | [tools.python_exec] | Run host Python code with timeout, output caps, sandbox mode, and artifact export. |
| Host execution | python_environment_info | [tools.python_exec] | Inspect the Python runtime used by python_execute. |
| Host execution | bash | [tools.bash] | Run shell commands through /bin/bash -lc. |
| Editing | apply_patch | [tools.apply_patch] | Apply structured add/update/delete/move patches under the configured workspace. |
| Managed files | filesystem | [tools.file_storage] | Unified managed-file facade for list, glob, info, write, move, delete, and send. |
| Managed files | glob_files | [tools.file_storage] | List managed files matching a glob pattern. |
| Managed files | read_file | [tools.file_storage] | Read a complete managed text file. |
| Managed files | code_read | [tools.file_storage] | Read a bounded line window from a managed text file. |
| Managed files | grep | [tools.grep] and [tools.file_storage] | Search managed files with regex or fixed-string matching. |
| Managed files | self_insert_artifact | [tools.file_storage] | Inject a managed file or image into the active runtime context. |
| Audio | transcribe_audio | [tools.audio_transcription] and [tools.file_storage] | Transcribe or translate managed audio files with faster-whisper. |
| Scheduled prompts | schedule | [scheduler.prompts] | Unified facade for create, list, cancel, and delete scheduled prompts. |
| Scheduled prompts | schedule_prompt | [scheduler.prompts] | Create a one-time or recurring scheduled prompt. |
| Scheduled prompts | list_scheduled_prompts | [scheduler.prompts] | List scheduled prompts for the current owner/chat context. |
| Scheduled prompts | cancel_scheduled_prompt | [scheduler.prompts] | Mark a scheduled prompt as cancelled. |
| Scheduled prompts | delete_scheduled_prompt | [scheduler.prompts] | Cancel and remove a scheduled prompt. |
| Delegation | fetch_agent_info | Enabled when agent definitions exist | Inspect a specialist agent definition. |
| Delegation | invoke_agent | Enabled when agent definitions exist | Delegate a task to a specialist agent. |
| Skills | list_skills | [tools.skills] | Discover current skills from configured skill directories. |
| Skills | activate_skill | [tools.skills] | Load full instructions for a discovered skill. |
| Async tasks | spawn_task | [tools.tasks] and [rabbitmq] | Queue a background worker task. |
| Async tasks | cancel_task | [tools.tasks] and [rabbitmq] | Cancel an active background task by ID. |
| Async tasks | list_tasks | [tools.tasks] and [rabbitmq] | List active background tasks. |
| MCP | mcp_<server>__<remote_tool> | [tools.mcp] | Dynamically discovered Model Context Protocol tools. |

Runtime Notes

  • Tool defaults are defined in minibot.adapters.config.schema and configured in config.toml.

  • [tools.file_storage] is the shared managed-file root used by file, grep, HTTP spillover, and audio tools.

  • [tools.audio_transcription] requires the stt extra: poetry install --extras stt.

  • [tools.mcp] requires the mcp extra: poetry install --extras mcp.

  • [tools.tasks] requires the rabbitmq extra and [rabbitmq].enabled = true.

  • Hidden compatibility aliases are normalized at execution time; prefer the public names in the table.
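
The notes above can be summarized as a minimal config.toml sketch. Section names come from the table; the keys and values shown are illustrative assumptions, not documented defaults:

```toml
# Illustrative config.toml fragment — enabling a few tool groups.
# Values are assumptions; see the per-tool sections below for key details.
[tools.http_client]
timeout_seconds = 30        # assumed value

[tools.file_storage]
root_dir = "./data/files"   # assumed path; shared root for files, grep, spillover

[rabbitmq]
enabled = true              # required for [tools.tasks], per the notes above
```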

Implementation Reference

Key-value memory tools for persistent user data.

Enabled by [tools.kv_memory] in config.toml.

build_kv_tools() returns the memory LLM tool binding, which supports the following actions: save, get, search, delete, and list_titles.

Each entry has an entry_id, title, data, optional metadata (JSON), source, and expires_at. Listing supports limit and offset for pagination.

Key config options:

  • sqlite_url — SQLite backend (default: sqlite+aiosqlite:///./data/kv_memory.db).

  • default_limit / max_limit — pagination controls.

  • default_owner_id — fallback owner when no user context is available.
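
A sketch of a [tools.kv_memory] section using the keys above. The sqlite_url value is the documented default; the remaining values are assumptions:

```toml
[tools.kv_memory]
sqlite_url = "sqlite+aiosqlite:///./data/kv_memory.db"  # documented default
default_limit = 20       # assumed pagination default
max_limit = 100          # assumed pagination cap
default_owner_id = "local"  # assumed fallback owner
```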

class minibot.llm.tools.http_client.HTTPClientTool(config: HTTPClientToolConfig, storage: LocalFileStorage | None = None)

Fetch HTTP/HTTPS resources on behalf of the LLM.

Enabled by [tools.http_client] in config.toml.

Exposes the http_request LLM tool supporting GET, POST, PUT, DELETE, PATCH, HEAD, and OPTIONS.

Response handling:

  • HTML responses are converted to plain text by default (response_processing_mode = "auto").

  • Whitespace is normalized when normalize_whitespace = true.

  • When the decoded body exceeds spill_after_chars characters and [tools.file_storage] is enabled, the full body is saved to a managed temp file and a short preview is returned inline.

Key config options:

  • timeout_seconds, max_bytes — request limits.

  • max_chars — inline body character cap.

  • spill_to_managed_file, spill_after_chars, spill_preview_chars, max_spill_bytes.
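
Putting the response-handling and limit options together, a hedged [tools.http_client] sketch (all values are assumptions):

```toml
[tools.http_client]
timeout_seconds = 30              # assumed request timeout
max_bytes = 5_000_000             # assumed raw-response cap
max_chars = 20_000                # assumed inline body cap
response_processing_mode = "auto" # documented default: HTML -> plain text
normalize_whitespace = true
spill_to_managed_file = true      # requires [tools.file_storage]
spill_after_chars = 20_000        # assumed spillover threshold
spill_preview_chars = 2_000       # assumed inline preview size
max_spill_bytes = 10_000_000      # assumed spill-file cap
```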

class minibot.llm.tools.file_storage.FileStorageTool(storage: LocalFileStorage, event_bus: EventBus | None = None)

Managed file operations scoped to a configured root directory.

Enabled by [tools.file_storage] in config.toml.

Exposes five LLM tools:

  • filesystem — unified facade: list, glob, info, write, move, delete, send.

  • glob_files — list files matching a glob pattern.

  • read_file — read a full text file.

  • code_read — read a bounded line window from a text file.

  • self_insert_artifact — inject a managed file or image into the active conversation context.

All paths are relative to root_dir. allow_outside_root = false prevents path traversal. Incoming uploads are saved to uploads_subdir when save_incoming_uploads = true.

Key config options:

  • root_dir — managed storage root.

  • max_write_bytes — per-write size limit.

  • allow_outside_root — disable path-escape guard (not recommended).
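
These options can be sketched as a [tools.file_storage] section (values are assumptions):

```toml
[tools.file_storage]
root_dir = "./data/files"    # assumed managed storage root
max_write_bytes = 2_000_000  # assumed per-write cap
allow_outside_root = false   # keep the path-escape guard on
save_incoming_uploads = true
uploads_subdir = "uploads"   # assumed subdirectory name
```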

class minibot.llm.tools.grep.GrepTool(storage: LocalFileStorage, config: GrepToolConfig)

Regex or fixed-string search over managed files.

Enabled by [tools.file_storage] in config.toml (requires file storage). The grep config is read from [tools.grep].

Exposes the grep LLM tool.

Supports recursive search, case-insensitive matching, fixed-string (literal) mode, hidden file inclusion, and configurable context lines before/after each match. Files exceeding max_file_size_bytes are skipped.

Key config options:

  • max_matches — global per-call match cap.

  • max_file_size_bytes — skip files larger than this.
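
A hedged [tools.grep] sketch using the two keys above (values are assumptions):

```toml
[tools.grep]
max_matches = 200              # assumed per-call match cap
max_file_size_bytes = 1_000_000  # assumed skip threshold
```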

class minibot.llm.tools.bash.BashTool(config: BashToolConfig)

Execute shell commands via /bin/bash -lc.

Enabled by [tools.bash] in config.toml.

Key config options:

  • default_timeout_seconds / max_timeout_seconds — execution time limits.

  • pass_parent_env — pass the full parent environment; when false, only keys in env_allowlist are forwarded.

  • max_output_bytes — combined stdout+stderr cap; excess is truncated.

Returns ok, exit_code, stdout, stderr, timed_out, truncated, and duration_ms.
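
A [tools.bash] sketch combining the options above; the allowlist entries and numeric values are assumptions:

```toml
[tools.bash]
default_timeout_seconds = 20
max_timeout_seconds = 120
pass_parent_env = false                     # forward only the allowlist below
env_allowlist = ["PATH", "HOME", "LANG"]    # assumed allowlist entries
max_output_bytes = 200_000                  # combined stdout+stderr cap
```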

class minibot.llm.tools.python_exec.HostPythonExecTool(config: PythonExecToolConfig, storage: LocalFileStorage | None = None)

Execute Python code on the host with configurable sandbox controls.

Enabled by [tools.python_exec] in config.toml.

Exposes two LLM tools: python_execute and python_environment_info.

Sandbox modes (sandbox_mode):

  • basic — no isolation, same process user.

  • rlimit — POSIX resource limits (CPU, memory, file size, open files).

  • cgroup — Linux only; wraps execution with systemd-run --scope.

  • jail — arbitrary command prefix (e.g. nsjail, firejail).

Artifacts: when save_artifacts = true and [tools.file_storage] is enabled, generated files matching artifact_globs are copied into managed storage after execution.

Key config options:

  • default_timeout_seconds / max_timeout_seconds

  • max_output_bytes — combined stdout+stderr cap.

  • python_path / venv_path — override the Python interpreter.

  • artifacts_enabled, artifacts_default_subdir, artifacts_max_files.
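
A [tools.python_exec] sketch showing a sandbox mode and artifact export; glob patterns and numeric values are assumptions:

```toml
[tools.python_exec]
default_timeout_seconds = 30
max_timeout_seconds = 120
max_output_bytes = 200_000
sandbox_mode = "rlimit"               # basic | rlimit | cgroup | jail
save_artifacts = true                 # requires [tools.file_storage]
artifact_globs = ["*.png", "*.csv"]   # assumed patterns
artifacts_default_subdir = "artifacts"  # assumed subdirectory
artifacts_max_files = 10              # assumed cap
```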

class minibot.llm.tools.audio_transcription.AudioTranscriptionTool(config: AudioTranscriptionToolConfig, storage: LocalFileStorage, facade: AudioTranscriptionFacade | None = None)

Transcribe or translate audio files using faster-whisper.

Enabled by [tools.audio_transcription] in config.toml. Requires the stt extra: poetry install --extras stt. Also requires [tools.file_storage] to resolve managed audio paths.

Exposes the transcribe_audio LLM tool with two task modes:

  • transcribe — output in the source language.

  • translate — output translated to English.

Auto-transcription: when auto_transcribe_short_incoming = true, incoming voice messages shorter than auto_transcribe_max_duration_seconds are transcribed automatically before the LLM processes them.

Key config options:

  • model — Whisper model size (tiny, base, small, medium, large-v3).

  • device — auto, cpu, or cuda.

  • compute_type — quantization (int8, float16, etc.).

  • beam_size, vad_filter.
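
A hedged [tools.audio_transcription] sketch; model choice and thresholds are assumptions:

```toml
[tools.audio_transcription]
model = "small"          # tiny | base | small | medium | large-v3
device = "auto"          # auto | cpu | cuda
compute_type = "int8"    # quantization for faster-whisper
beam_size = 5
vad_filter = true
auto_transcribe_short_incoming = true
auto_transcribe_max_duration_seconds = 120  # assumed threshold
```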

class minibot.llm.tools.scheduler.SchedulePromptTool(service: ScheduledPromptService, min_recurrence_interval_seconds: int = 60)

Create, list, cancel, and delete scheduled prompt jobs.

Enabled by [scheduler.prompts] in config.toml.

Exposes five LLM tools:

  • schedule — unified action interface (create / list / cancel / delete).

  • schedule_prompt — create a future prompt job.

  • list_scheduled_prompts — list jobs for the current owner/chat context.

  • cancel_scheduled_prompt — mark a job as cancelled.

  • delete_scheduled_prompt — cancel and permanently remove a job.

Recurrence: set recurrence_type = "interval" and provide recurrence_interval_seconds (minimum enforced by min_recurrence_interval_seconds).

Prompt roles: user (bot answers), assistant (sent directly to the user), system, or developer.

Key config options:

  • sqlite_url — SQLite backend for job persistence.

  • poll_interval_seconds — how often due jobs are dispatched.

  • min_recurrence_interval_seconds — floor for recurring intervals.
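
A [scheduler.prompts] sketch using the keys above; the database path and poll interval are assumptions:

```toml
[scheduler.prompts]
sqlite_url = "sqlite+aiosqlite:///./data/scheduler.db"  # assumed path
poll_interval_seconds = 15            # assumed dispatch interval
min_recurrence_interval_seconds = 60  # documented default floor
```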

class minibot.llm.tools.mcp_bridge.MCPToolBridge(*, server_name: str, client: MCPClient, name_prefix: str = 'mcp', enabled_tools: list[str] | None = None, disabled_tools: list[str] | None = None)

Bridge Model Context Protocol (MCP) servers to LLM tool bindings.

Enabled by [tools.mcp] in config.toml. Each entry under [[tools.mcp.servers]] creates one bridge instance. Requires the mcp extra: poetry install --extras mcp.

Remote tools are exposed with the naming convention:

<name_prefix>_<server_name>__<remote_tool_name>

The default prefix is mcp.

Filtering:

  • enabled_tools — whitelist; only listed remote tool names are exposed.

  • disabled_tools — blacklist; listed names are always excluded.

Key config options:

  • name_prefix — prefix for all bridged tool names (default: "mcp").

  • timeout_seconds — call timeout.

  • [[tools.mcp.servers]] — list of server definitions; each supports transport, command, args, env, cwd, url, headers, enabled_tools, disabled_tools.
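
A hedged [tools.mcp] sketch with one server entry. The command, transport value, and remote tool names are hypothetical; only the key names come from the list above:

```toml
[tools.mcp]
name_prefix = "mcp"     # documented default
timeout_seconds = 30    # assumed call timeout

[[tools.mcp.servers]]
transport = "stdio"                  # assumed transport value
command = "my-mcp-server"            # hypothetical server command
args = ["--stdio"]                   # hypothetical arguments
enabled_tools = ["search", "fetch"]  # hypothetical remote tool names
```

With this entry, a remote tool named search would surface as mcp_my-mcp-server__search per the naming convention above.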

class minibot.llm.tools.skill_loader.SkillLoaderTool(registry: SkillRegistry)

Discover and load agent skills at runtime.

Enabled when [tools.skills] is configured in config.toml (skill discovery paths are set via skills.paths).

Exposes two LLM tools:

  • list_skills — returns up to 8 skills ranked by relevance to an optional search query. Ranking order: exact match → prefix → substring → description substring → fuzzy similarity.

  • activate_skill — loads the full instructions for a named skill and injects them into the context.

Skills are markdown files (SKILL.md) discovered from the configured paths; the registry refreshes automatically when files change.