Build With Iris.Md.
Everything you need to install, configure, and operate your own autonomous agent.
Getting Started.
- Install the Iris.Md agent on Linux, macOS, or WSL with a single shell command.
- Boot the agent, connect a model, and have your first conversation.
- Configure models, channels, sandboxes, and memory through ~/.iris/config.toml.
- Bring your own keys for OpenAI, Anthropic, Google, or any local OpenAI-compatible endpoint.
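Since configuration flows through a single TOML file, a sketch of what ~/.iris/config.toml could look like may help orient a first read. Every key and table name below is an illustrative assumption, not the agent's documented schema — see the Configuration page for the real one:

```toml
# Hypothetical layout — key names are assumptions, not the documented schema.

[model]
provider = "anthropic"            # or "openai", "google", or a local endpoint
api_key_env = "ANTHROPIC_API_KEY" # read the key from the environment

[channels.telegram]
enabled = true

[sandbox]
backend = "docker"                # one of: local, docker, ssh, singularity, modal

[memory]
persist = true                    # keep long-term memory across restarts
```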
Core Concepts.
- Spawn isolated subagents with their own context, terminal, and lifecycle.
- Long-term memory that survives restarts and gets richer the longer the agent runs.
- Auto-generated, reusable procedures the agent writes for itself.
- Five sandbox backends: local, Docker, SSH, Singularity, and Modal.
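Supporting five interchangeable backends implies a common execution interface behind them. A minimal Python sketch of that idea — the class and method names here are illustrative assumptions, not Iris.Md's actual API — with a local-subprocess backend as the simplest case:

```python
import subprocess
from abc import ABC, abstractmethod


class Sandbox(ABC):
    """Hypothetical interface a local/Docker/SSH/Singularity/Modal
    backend could each implement; names are assumptions for illustration."""

    @abstractmethod
    def run(self, command: str) -> str:
        """Execute a shell command inside the sandbox and return stdout."""


class LocalSandbox(Sandbox):
    """Simplest backend: run directly on the host with no isolation."""

    def run(self, command: str) -> str:
        result = subprocess.run(
            command, shell=True, capture_output=True, text=True, check=True
        )
        return result.stdout
```

A Docker or SSH backend would implement the same `run()` by shelling out to `docker exec` or `ssh` respectively, so agent code that drives a sandbox stays backend-agnostic.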
Integrations.
- Talk to your agent from any Telegram chat — DMs, groups, or channels.
- Run the agent as a Discord bot with slash-command support.
- Mention the agent in any Slack channel or DM it directly.
- Connect via the WhatsApp Cloud API for end-to-end agent messaging.
- Send and receive over IMAP/SMTP — perfect for scheduled briefings.
- The terminal is a first-class channel.
- Open PRs, triage issues, and review diffs straight from the agent.
- Create, update, and triage Linear issues from any conversation.
- Read and write Notion pages and databases as long-term shared memory.
- Gmail, Calendar, Drive, and Docs through a single OAuth connection.
- Post, reply, and monitor mentions on X via the v2 API.
- Federated, end-to-end encrypted chat for self-hosted deployments.
- Private, end-to-end encrypted messaging via signal-cli.
- Generic inbound/outbound webhooks for any service not listed above.