McClaw Documentation
McClaw is a native macOS AI assistant that brings Claude, ChatGPT, Gemini, Ollama, DashScope (Qwen), Copilot, Codex, and BitNet together in a single app. Instead of using API keys, McClaw wraps the official CLI tools you already have installed, adding a beautiful native interface with voice, canvas, git integration, native channels, connectors, and automation on top.
System Requirements
- macOS 15 (Sequoia) or later
- Apple Silicon or Intel Mac (Universal Binary)
- At least one AI CLI installed (Claude, ChatGPT, Gemini, Ollama, DashScope, Copilot, Codex, or BitNet)
Architecture Overview
McClaw uses a CLI Bridge architecture. Rather than calling AI provider APIs directly, it communicates with the official command-line tools each provider offers. This means:
- McClaw never handles your API keys or credentials
- Authentication is managed by each CLI with its own provider
- You get the same capabilities as the official CLI, plus a native UI
- Streaming responses are parsed in real time from CLI output
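The bridge pattern itself is simple to picture: launch the provider's CLI as a subprocess and consume its stdout line by line as it arrives. A minimal shell sketch of the idea (the function name and the "event:" formatting are illustrative, not McClaw internals):

```shell
# Run a command and surface its output as it streams in.
# 'printf' stands in for the real provider CLI in this sketch.
stream_lines() {
  "$@" | while IFS= read -r line; do
    echo "event: $line"   # in McClaw each line would be parsed and rendered
  done
}

stream_lines printf 'hello\nworld\n'
# → event: hello
#   event: world
```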
Installation
There are two ways to install McClaw:
Option 1: Download the App
McClaw will be distributed as a Universal Binary that runs natively on both Apple Silicon and Intel Macs. The download will be available here when the app launches.
Drag McClaw.app to your /Applications folder.
Open McClaw from your Applications folder or Spotlight. On first launch, macOS may ask you to confirm you want to open the app since it was downloaded from the internet.
Option 2: Build from Source
If you prefer to build McClaw yourself, you need Xcode 16+ with Swift 6.0:
# Source code will be available when the project goes public
# git clone https://github.com/joseconti/mc-claw
# cd mc-claw
# ./scripts/build-app.sh
This generates build/McClaw.app with the complete bundle, including Info.plist, app icon, Sparkle framework for auto-updates, and code signing.
Use ./scripts/build-app.sh to build the app, not swift build alone. The script creates the full macOS app bundle; running swift build on its own produces only the binary, without the required bundle structure.
Auto Updates
McClaw uses the Sparkle framework for automatic updates. Once installed, you can check for updates from the menu bar. Updates are downloaded and applied in the background.
CLI Setup
McClaw requires at least one AI CLI tool to be installed on your system. It auto-detects installed CLIs by checking standard installation paths and your shell environment.
Supported Providers
Claude CLI
Full-featured provider with native task scheduling and MCP server support.
npm install -g @anthropic-ai/claude-code
Authenticate with:
claude auth login
ChatGPT CLI
OpenAI's CLI for GPT-4 and later models.
brew install openai/chatgpt/chatgpt
Follow the CLI prompts to authenticate with your OpenAI account.
Gemini CLI
Google's AI CLI with MCP configuration support via ~/.gemini/settings.json.
npm install -g @google/gemini-cli
Authenticate with your Google account when prompted.
Ollama
Run AI models locally with complete privacy. No cloud dependency.
brew install ollama
ollama pull llama3
No authentication needed. Models run entirely on your machine.
DashScope (Qwen)
Alibaba Cloud's Qwen family of models. Powerful multilingual AI with vision and coding capabilities.
# Install via pip
pip install dashscope
Authenticate with your DashScope API key:
export DASHSCOPE_API_KEY=sk-xxxxxxxx
Get your API key from the DashScope Console. Supports Qwen-Max, Qwen-Plus, Qwen-Turbo, and vision models.
Copilot CLI
GitHub Copilot's CLI for code-focused AI assistance and developer workflows.
gh extension install github/gh-copilot
Requires the GitHub CLI (gh) and an active GitHub Copilot subscription. Authenticate with:
gh auth login
Codex
AI pair programming tool. Edits code across multiple files using any LLM backend.
npm install -g @openai/codex
Codex is OpenAI's CLI coding agent. Set your OpenAI API key:
# Use with OpenAI
export OPENAI_API_KEY=sk-xxxxxxxx
# Or use with Anthropic
export ANTHROPIC_API_KEY=sk-ant-xxxxxxxx
BitNet Experimental
Microsoft's 1-bit LLM framework for ultra-efficient local inference on CPU. No GPU required.
McClaw handles the full installation automatically: cloning the repository, creating a Conda environment, downloading the default model (BitNet-b1.58-2B-4T), and compiling optimized kernels for your CPU. You can manage models and settings from Settings → CLIs → BitNet.
No authentication needed. All inference runs locally on your machine.
CLI Detection
McClaw detects installed CLIs automatically by:
- Checking hardcoded standard paths (e.g. /usr/local/bin, /opt/homebrew/bin)
- Falling back to your login shell's PATH
If a CLI is installed in a non-standard location, McClaw will still find it as long as it is on your shell's PATH.
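The detection order described above can be sketched in a few lines of shell. The helper name find_cli and the exact path list are illustrative; McClaw's real path set is internal:

```shell
# Resolve a CLI binary: well-known install dirs first, then the shell's PATH.
find_cli() {
  name=$1
  for dir in /usr/local/bin /opt/homebrew/bin; do
    [ -x "$dir/$name" ] && { echo "$dir/$name"; return 0; }
  done
  command -v "$name" 2>/dev/null   # fall back to PATH lookup
}

find_cli ollama || echo "ollama not installed"
```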
First Launch
When you open McClaw for the first time, the Onboarding Wizard guides you through initial setup in 6 steps:
1. Introduction to McClaw and its CLI Bridge architecture.
2. McClaw scans your system for installed AI CLIs and shows which providers are available.
3. Pick which AI provider to use by default. You can change this at any time and switch mid-conversation.
4. McClaw asks for any macOS permissions it needs (microphone for voice mode, etc.). All permissions go through standard macOS TCC prompts.
5. Configure connectors to integrate with Google, GitHub, Slack, and 20+ other services directly from McClaw.
6. Setup is complete. McClaw opens to the main chat window.
Configuration
McClaw stores its configuration in ~/.mcclaw/mcclaw.json. This file is created automatically on first launch and includes your preferences, provider settings, and feature configuration. You typically do not need to edit this file directly — everything is configurable from the Settings window.
Chat
The chat interface is the core of McClaw. It provides a full-featured conversation experience powered by your chosen AI provider.
Key Features
- Streaming Responses — Responses appear in real time as the AI generates them, parsed from the CLI's streaming output.
- Markdown Rendering — Full Markdown support with headings, lists, tables, bold, italic, links, and more.
- Code Highlighting — Code blocks are syntax-highlighted with language detection. Copy any code block with one click.
- File Attachments — Drag and drop files into the chat to include them as context for the AI.
- Conversation History — All conversations are saved and searchable in the sidebar.
- Projects — Organize conversations into projects for better management.
Switching Providers
You can switch AI providers at any time, even mid-conversation. The provider selector is accessible from the chat input bar. When you switch, McClaw routes your next message through the new provider's CLI.
Slash Commands
McClaw supports built-in slash commands for quick actions:
| Command | Description |
|---|---|
| /status | Show current provider, model, token usage, and cost |
| /new / /reset | Start a clean session |
| /compact | Compress context to reduce token usage |
| /think <level> | Set thinking mode (off/minimal/low/medium/high/xhigh) |
| /verbose on\|off | Toggle verbose output |
| /usage off\|tokens\|full | Configure usage tracking display |
| /model <name> | Change the AI model for the current provider |
| /cli <name> | Switch to a different CLI provider |
| /help | Show command reference |
@fetch Inline Data
Type @fetch followed by a connector name in your message to pull live data from connected services directly into the conversation. For example:
@fetch gmail Show me my unread emails from today
McClaw enriches the prompt with the fetched data before sending it to the AI, so the response includes up-to-date information from your services.
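Conceptually, the enrichment step just prepends the fetched data to your message before it is handed to the provider CLI. A hypothetical sketch (the enrich helper and its wrapper text are illustrative, not McClaw's actual prompt template):

```shell
# Prepend connector data to the user's message before it reaches the AI.
enrich() {
  data=$1; prompt=$2
  printf 'Context from connector:\n%s\n\nUser: %s\n' "$data" "$prompt"
}

enrich "3 unread emails from today" "Show me my unread emails from today"
```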
Voice Mode
McClaw includes a full voice interaction system built on native macOS frameworks. No cloud speech services are required.
Modes
- Push-to-Talk — Hold a configurable hotkey to speak. Release to send your message to the AI.
- Wake Word — Say a custom wake word (default: "Hey McClaw") to activate listening. McClaw listens for the wake word in the background using minimal resources.
- Always-On — Continuous listening mode. McClaw processes everything you say and sends it as chat input.
Speech Recognition
Voice input uses SFSpeechRecognizer, Apple's native on-device speech recognition framework. This means:
- Speech processing happens locally on your Mac
- No audio is sent to third-party servers
- Works offline (with on-device models)
- Supports all languages that macOS speech recognition supports
Speech Synthesis
AI responses can be read aloud using NSSpeechSynthesizer. You can choose from any macOS system voice in the Voice Settings.
Voice Overlay
When voice mode is active, a floating overlay shows the current state (listening, processing, speaking) with a visual waveform. The overlay is non-intrusive and can be positioned anywhere on screen.
Permissions
Voice mode requires microphone access. macOS will show a standard permission prompt the first time you activate voice mode. You can manage this in System Settings > Privacy & Security > Microphone.
Canvas
Canvas lets the AI generate interactive HTML content that renders in a floating panel alongside your chat.
How It Works
- Ask the AI to create something visual (a chart, a form, a mini app, a diagram).
- The AI generates HTML/CSS/JavaScript code.
- McClaw renders it in a floating NSPanel with an embedded WKWebView.
- The canvas updates in real time as the AI streams its response.
Features
- Hot Reload — A file watcher monitors canvas content and reloads the preview instantly when changes are detected.
- JavaScript Bridge — Canvas content can communicate with McClaw through a JS bridge, enabling system integration (file access, clipboard, notifications).
- Custom Scheme Handler — Canvas uses a custom URL scheme (mcclaw://) for loading local resources securely.
- Floating Panel — The canvas window floats above other windows and can be resized, moved, or pinned.
MCP Servers
The Model Context Protocol (MCP) allows AI providers to connect to external tools and data sources. McClaw provides a visual interface for managing MCP server configurations.
Provider Support
| Provider | Transport | Scope | Config Location |
|---|---|---|---|
| Claude | stdio, SSE, streamable-http | User or Project | Managed via claude mcp CLI |
| Gemini | stdio only | User | ~/.gemini/settings.json |
| ChatGPT | MCP not yet supported by the CLI | N/A | N/A |
| Ollama | MCP not yet supported | N/A | N/A |
| DashScope | MCP not supported | N/A | N/A |
| Copilot | MCP not supported | N/A | N/A |
| Codex | MCP not supported | N/A | N/A |
| BitNet | MCP not supported (local 1-bit inference only) | N/A | N/A |
Managing MCP Servers
Open Settings > MCP Servers to add, edit, or remove MCP server configurations. McClaw provides a form-based editor so you do not need to edit JSON files manually.
For each MCP server, you configure:
- Name — A human-readable identifier
- Command — The executable to run (for stdio transport)
- Arguments — Command-line arguments
- Environment Variables — Environment variables to pass to the server process
- Transport — stdio, SSE, or streamable-http (Claude only)
- Scope — User-level or project-level (Claude only)
Claude MCP Commands
For the Claude provider, McClaw uses the official CLI commands under the hood:
# Add a server
claude mcp add server-name -- command arg1 arg2
# List configured servers
claude mcp list
# Remove a server
claude mcp remove server-name
Schedules & Automation
McClaw lets you schedule recurring or one-time AI tasks. All schedules are managed locally by the app and persist across restarts.
Schedule Types
- Preset Intervals — Quick presets: every 15 min, 30 min, 1 hour, 6 hours, 12 hours, daily, weekly, monthly.
- Flexible Intervals — "Every X [minutes|hours|days|weeks|months|years] at HH:MM". For example: "Every 2 days at 09:00" or "Every 3 weeks at 14:30". McClaw generates the correct cron expression automatically.
- Cron Expressions — Standard cron syntax for precise scheduling (e.g. 0 9 * * 1-5 for weekdays at 9 AM).
- One-Time Schedules — Run a task once at a specific date and time. Optionally auto-delete the job after successful execution.
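As an illustration of the flexible-interval mapping, here is a toy converter from "Every N days at HH:MM" to a cron expression. The function is a sketch, not McClaw's generator; stepping the day-of-month field is the usual cron approximation of "every N days":

```shell
# Toy mapping from "Every N days at HH:MM" to a five-field cron expression.
flex_to_cron() {
  n=$1; hh=$2; mm=$3
  echo "$mm $hh */$n * *"
}

flex_to_cron 2 9 00    # "Every 2 days at 09:00"
# → 00 9 */2 * *
```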
Creating a Schedule
Open Settings > Cron and click New Job. The editor is organized in 4 steps:
1. Task name, description, and the AI prompt to execute.
2. Choose the schedule type: preset interval, flexible interval, cron expression, or one-time date.
3. Select the AI provider, thinking mode, timeout, and session target (current session or isolated).
4. Configure how results are delivered. You can select multiple delivery methods simultaneously:
- Notifications — Show a macOS system notification when the task completes.
- History — Save results in the schedule's run history log.
- Native Channels — Send results to one or more configured channels (Telegram, Slack, Discord, etc.), each with an optional recipient (chat ID, channel name, etc.).
Connector Enrichment
Scheduled tasks can include @fetch bindings. When the task runs, McClaw automatically fetches the latest data from connected services before sending the prompt to the AI.
Run History
Each job keeps a log of past executions with status (ok, error, skipped), timestamp, duration, and result summary. View history by selecting a job in Settings > Cron.
Connectors
Connectors let McClaw pull live data from external services and include it in your AI conversations. All connectors use OAuth 2.0 with PKCE for authentication, and tokens are stored securely in the macOS Keychain.
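For readers unfamiliar with PKCE: the client generates a random code verifier, sends its SHA-256 challenge when opening the authorization URL, and later presents the verifier when exchanging the authorization code for tokens, so an intercepted code is useless on its own. The two values can be produced like this (a generic sketch using openssl, not McClaw's implementation):

```shell
# code_verifier: 32 random bytes, base64url-encoded without padding
verifier=$(head -c 32 /dev/urandom | base64 | tr '+/' '-_' | tr -d '=\n')
# code_challenge: base64url(SHA-256(verifier)), also unpadded
challenge=$(printf '%s' "$verifier" | openssl dgst -sha256 -binary | base64 | tr '+/' '-_' | tr -d '=\n')
echo "code_verifier=$verifier"
echo "code_challenge=$challenge"
```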
Available Connectors
Google Workspace
- Gmail — Read emails, search messages, list labels
- Google Calendar — View events, check availability, create events
- Google Drive — List files, search documents, read content
- Google Contacts — Search and browse contacts
- Google Tasks — View and manage task lists
Development
- GitHub — Repos, issues, PRs, notifications. Powers the Git Integration panel.
- GitLab — Projects, merge requests, pipelines. Powers the Git Integration panel.
- Linear — Issues, projects, cycles
- Jira — Issues, boards, sprints
- Notion — Pages, databases, search
Communication
- Slack — Channels, messages, search
- Discord — Servers, channels, messages
- Telegram — Chats, messages
Microsoft 365
- Outlook Mail — Emails, folders, search
- Outlook Calendar — Events, availability
- OneDrive — Files, folders, search
- Microsoft To Do — Tasks, lists
Productivity
- Todoist — Tasks, projects, labels
- Trello — Boards, cards, lists
- Airtable — Bases, tables, records
- Dropbox — Files, folders, sharing
Utilities
- Weather — Current conditions, forecasts
- RSS — Feed reading and monitoring
- Webhooks — Custom HTTP endpoints
- WordPress — 13 sub-connectors via MCP bridge (posts, pages, media, users, comments, plugins, themes, settings, taxonomies, menus, widgets, blocks, WooCommerce)
Setting Up OAuth Credentials
Google and Microsoft connectors require you to provide your own OAuth Client ID and Client Secret. You only need to create these once per provider — the same credentials are shared across all connectors of the same provider (e.g., one set of Google credentials works for Gmail, Calendar, Drive, Contacts, and Tasks).
Google Workspace Setup
Go to the Google Cloud Console → Create Project page and create a new project (or use an existing one). Give it any name (e.g., "McClaw").
Go to APIs & Services → Library and enable the APIs for the connectors you want to use. Search for each one and click Enable:
- Gmail API
- Google Calendar API
- Google Drive API
- People API (for Contacts)
- Tasks API
Go to APIs & Services → OAuth consent screen:
- User Type: External (or Internal if you have Google Workspace)
- Fill in the app name (e.g., "McClaw"), support email, and developer contact
- Add the scopes for the APIs you enabled (e.g., gmail.readonly, calendar.readonly)
- Add your Google email as a Test User (required while the app is in "Testing" mode)
Go to APIs & Services → Credentials and click Create Credentials → OAuth client ID:
- Application type: Desktop app
- Name: "McClaw" (or any name you prefer)
Google will generate a Client ID (e.g., 123456-abc.apps.googleusercontent.com) and a Client Secret (e.g., GOCSPX-xxxxxxxx). Copy both values.
Open Settings → Connectors, select any Google connector (e.g., Gmail), paste your Client ID and Client Secret, then click Sign in with Google. The credentials are saved globally and will auto-populate for all other Google connectors.
Microsoft 365 Setup
Go to the Azure Portal → App registrations and click New registration. Name it "McClaw", select Accounts in any organizational directory and personal Microsoft accounts, and set the redirect URI to mcclaw://oauth/callback (type: Public client/native).
In your app registration, go to API permissions → Add a permission → Microsoft Graph → Delegated permissions and add:
- Mail.Read (Outlook Mail)
- Calendars.Read (Outlook Calendar)
- Files.Read (OneDrive)
- Tasks.Read (Microsoft To Do)
Go to Certificates & secrets → New client secret. Copy the Value (not the Secret ID). Also copy the Application (client) ID from the Overview page.
Open Settings → Connectors, select any Microsoft connector, paste your Client ID and Client Secret, then click Sign in.
Connecting Other Services
Development, Communication, Productivity, and Utility connectors use API keys, Personal Access Tokens, or Bot Tokens. Below are direct links to create the credentials for each service:
Development Connectors
| Service | Credential Type | Where to Create | Format / Notes |
|---|---|---|---|
| GitHub | Personal Access Token | GitHub → Settings → Developer settings → Tokens | ghp_... (classic) or github_pat_... (fine-grained). Scopes: repo, read:user, notifications. |
| GitLab | Personal Access Token | GitLab → User Settings → Access Tokens | glpat-.... Scopes: read_user, read_api. |
| Linear | API Key | Linear → Settings → API | Create a Personal API Key. Full read access to your workspace. |
| Jira | API Token | Atlassian → Account → Security → API tokens | Format in McClaw: your-email@example.com:your-api-token. Also enter your Jira domain (e.g., mycompany). |
| Notion | Integration Token | Notion → My Integrations | Create a new integration, copy the Internal Integration Secret. Remember to share pages with your integration in Notion. |
Communication Connectors
| Service | Credential Type | Where to Create | Format / Notes |
|---|---|---|---|
| Slack | Bot Token | Slack API → Your Apps | xoxb-.... Create a Slack App, add Bot Token Scopes (chat:write, channels:read, groups:read), install to workspace, copy the Bot User OAuth Token. |
| Discord | Bot Token | Discord Developer Portal → Applications | Create an application, go to Bot section, click Reset Token to generate a new one. Enable Message Content Intent in the Bot settings. |
| Telegram | Bot Token | Telegram → @BotFather | Format: 123456789:ABCdefGHI.... Send /newbot to @BotFather and follow the instructions. |
Productivity & Utility Connectors
| Service | Credential Type | Where to Create | Format / Notes |
|---|---|---|---|
| Todoist | API Token | Todoist → Settings → Integrations → Developer | Copy your personal API token from the Developer tab. |
| Trello | API Key + Token | Trello → Power-Ups Admin | Create a Power-Up to get an API Key, then generate a Token via the authorization link on the same page. |
| Airtable | Personal Access Token | Airtable → Developer Hub → Tokens | Create a new token with data.records:read and schema.bases:read scopes. |
| Dropbox | Access Token | Dropbox → App Console | Create an app, choose "Scoped access" with "Full Dropbox" access, then generate an access token in the Settings tab. |
| Weather | API Key | OpenWeatherMap → API Keys | Free tier available. Sign up and copy your API key. |
Native Channels (Self-Hosted Platforms)
These connectors require a server URL plus a token or API key:
| Service | Credential Type | Where to Create | Notes |
|---|---|---|---|
| Matrix | Access Token | Element → Settings → Help & About → Access Token | Or generate via the Matrix API: POST /_matrix/client/r0/login. Also enter your homeserver URL. |
| Mattermost | Personal Access Token | Account Settings → Security → Personal Access Tokens | Your admin must enable PATs in the System Console. Also enter your Mattermost server URL. |
| Mastodon | Access Token | Preferences → Development → New Application | Create an app on your instance, copy the access token. Also enter your instance URL (e.g., https://mastodon.social). |
| Zulip | API Key | Settings → Your Bots → Add a new bot | Copy the bot's API Key and email. Also enter your Zulip server URL. |
| Rocket.Chat | Auth Token | My Account → Security → Personal Access Tokens | Generate a token, copy both Token and User ID. Also enter your Rocket.Chat server URL. |
| Twitch | OAuth Token + Client ID | Twitch Developer Console → Applications | Register an application, copy the Client ID. Generate an OAuth token with chat:read and chat:edit scopes. |
Using Connectors in Chat
Once connected, use connectors in your conversations:
- @fetch gmail — Include your latest emails in the conversation
- @fetch calendar — Include today's events
- @fetch github — Include recent notifications or PRs
You can also use the Connector Action Picker in the chat input bar to browse available actions visually.
Token Management
Connector tokens are stored in the macOS Keychain and refreshed automatically when they expire. You can revoke access to any connector from Settings > Connectors at any time.
Git Integration
McClaw includes a dedicated Git section in the sidebar that lets you browse your GitHub and GitLab repositories, inspect branches, pull requests, issues, and commits — all within the app. Combined with AI chat, you can ask questions about your repos and execute Git operations through natural language.
Enabling Git
The Git section is disabled by default. To enable it:
Go to Settings → Connectors and set up your GitHub or GitLab connector with a Personal Access Token.
Go to Settings → Features and toggle Show Git section in sidebar.
A new Git item appears in the sidebar. Click it to see your repositories, sorted by last update. Use the platform selector to switch between GitHub and GitLab.
Repository Browser
The Git panel shows your repositories with search and sort controls. Each repo displays:
- Repository name, visibility (public/private), and fork status
- Primary language with color indicator
- Star count and open PR count
- Last updated time
- Local clone indicator (if McClaw detects a local clone on your machine)
Repository Detail
Double-click a repository to open the detail view with four tabs:
- Branches — All branches with default/protected indicators, ahead/behind counts. Click a branch to set it as the active context.
- Pull Requests — Open, merged, and closed PRs with author, source/target branches, and review state badges.
- Issues — Open and closed issues with labels, assignees, and relative timestamps.
- Commits — Recent commit history with SHA, message, author, and date.
Git Context in Chat
When you select a repository, McClaw creates a Git context that appears as a chip in the chat input bar (e.g., owner/repo / main). This context is automatically injected into your AI prompts, so the AI knows which repository and branch you are working with.
You can:
- Ask questions about the repo: "What are the open PRs for this repo?"
- Request Git operations: "Show me the diff for the last commit"
- Run local Git commands on cloned repos via the @git() intercept
@git() Commands
When a repository has a local clone, the AI can execute Git commands through the @git() intercept — similar to how @fetch() works for connectors. The AI includes @git(status), @git(log --oneline -10), or @git(diff) in its response, and McClaw executes them locally and feeds the output back to the AI.
Supported Operations
| Type | Operations | Confirmation |
|---|---|---|
| Read | log, diff, status, blame, show, branch | No (automatic) |
| Write | clone, checkout, pull, add, commit, push, tag, stash, merge | Yes (confirmation card) |
| Platform API | List repos, branches, PRs, issues, commits, create releases, close issues, merge PRs | Write actions only |
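The read/write split above amounts to a simple classifier: read operations run immediately, anything else is held for confirmation. A sketch with illustrative names (run_git, the message text, and the example repo path are not McClaw internals):

```shell
# Classify an intercepted Git command: reads run at once, writes wait
# for user confirmation.
run_git() {
  cmd=$1; repo=$2
  case $cmd in
    log|diff|status|blame|show|branch)
      git -C "$repo" "$cmd" ;;                    # read: automatic
    *)
      echo "confirmation required: git $cmd" ;;   # write: show a card first
  esac
}

run_git push ~/code/my-repo
# → confirmation required: git push
```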
Platform Selector
If you have both GitHub and GitLab connected, a capsule pill selector at the top of the Git panel lets you switch between platforms. The selector follows the same design as the CLI provider selector in the chat.
Required Token Scopes
For the Git section to work, your Personal Access Token needs the following scopes:
| Platform | Read-Only Scopes | Full Access Scopes |
|---|---|---|
| GitHub (classic) | repo, read:user, notifications | Same (classic tokens include write by default) |
| GitHub (fine-grained) | Contents: Read, Issues: Read, Pull requests: Read, Metadata: Read, Notifications: Read | Contents: Read & Write, Issues: Read & Write, Pull requests: Read & Write |
| GitLab | read_user, read_api | read_user, api |
Git AI Features
Beyond basic repository browsing, McClaw provides AI-powered Git workflows with a confirmation pipeline for safety, contextual actions, and automated repository monitoring.
Execution Pipeline
McClaw uses a confirmation-based pipeline for Git operations:
- @git-confirm — AI requests a Git operation; McClaw shows a confirmation card with the command details before executing.
- Destructive action warnings — Commands like --force, reset, and rebase show highlighted warnings.
- Chain depth tracking — Multi-step chains are limited to 10 operations to prevent infinite loops.
- Sensitive content filtering — Commands are validated before execution to prevent credential exposure.
Contextual Actions (Right-Click Menus)
Right-click on any item in the Git panel to access AI-powered actions specific to the item type:
| Item | Available Actions |
|---|---|
| Pull Request | Review, Summarize, Suggest Improvements, Find Conflicts, Post Review, Merge |
| Issue | Analyze, Suggest Fix, Create Branch, Close, Find Related Issues |
| Commit | Explain, Impact Analysis, Revert, Cherry-pick |
| Branch | Compare, Create PR, Delete, Merge |
| File | Explain, Find Usages, Suggest Improvements, Write Tests |
| Lines | Select lines in the code viewer and use "Ask AI about lines X-Y" |
Quick Actions Bar
When a repository is selected, a floating quick actions panel provides one-click access to common AI operations:
- Explain Repo — Get an AI summary of the repository's purpose and structure
- What Broke? — Analyze recent commits for potential regressions
- Changelog — Generate a changelog from recent commits
- Health Check — Comprehensive repo health analysis
- Security Audit — Scan for potential security issues
- Find TODOs — Locate all TODO/FIXME/HACK comments
- This Week — Summary of activity from the last 7 days
Merge Conflict Resolution
When the AI detects merge conflicts, it can suggest resolutions using the confirmation pipeline. Each conflict shows both sides with the AI's proposed resolution, and you approve or reject before applying.
Cross-Repo Intelligence
Hold Cmd+click to select multiple repositories in the sidebar. McClaw adds all selected repos as context for the AI, enabling cross-repository analysis like "Compare the auth implementation between repo A and repo B".
Repository Monitoring
Combine Git Integration with Schedules to automate repository monitoring:
- PR Review Bot — Automatically review new PRs on a schedule
- Stale Branch Cleanup — Identify and alert on branches older than N days
- Security Scans — Periodic dependency and code security audits
- Health Reports — Weekly summaries delivered to Slack, Telegram, or other channels
Prompt Templates
McClaw includes 30+ built-in prompt templates optimized for Git scenarios. These templates provide structured context to the AI, including repository metadata, branch information, PR diffs, and commit history. Templates are used automatically by contextual actions and quick actions.
Node Mode
Node Mode gives the AI access to macOS hardware capabilities and system operations. All actions require proper TCC (Transparency, Consent, and Control) permissions and go through the execution approval flow.
Available Commands
System
- system.run — Execute shell commands with the approval flow (deny/allowlist/ask modes)
- system.which — Resolve executables via PATH
- system.notify — Send macOS notifications via UNUserNotificationCenter
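The deny/allowlist/ask policy behind system.run can be sketched as a tiny decision function. The policy mode names come from the docs; the allowlist contents and the helper name are illustrative:

```shell
# Decide whether a command may execute under each policy mode.
approve() {
  cmd=$1; mode=$2
  allowlist="ls pwd date"
  case $mode in
    deny)  echo deny ;;
    ask)   echo ask ;;
    allow) case " $allowlist " in
             *" $cmd "*) echo allow ;;   # on the list: run without prompting
             *)          echo ask ;;     # off the list: fall back to asking
           esac ;;
  esac
}

approve ls allow   # → allow
approve rm allow   # → ask
```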
Camera
- camera.list — List available cameras (built-in, external)
- camera.snap — Capture a photo with JPEG encoding and optional resize
- camera.clip — Record video (max 30s, optional audio)
Requires Camera permission in System Settings > Privacy & Security.
Screen Recording
- screen.record — Record screen via ScreenCaptureKit (H.264 + AAC)
Configurable: screen index, duration (max 60s), FPS (max 30), optional audio. Requires Screen Recording permission.
Location
- location.get — Get current location via CLLocationManager
Requires Location permission (when in use).
Canvas Integration
- canvas.present — Show the canvas panel with content
- canvas.hide — Hide the canvas panel
- canvas.navigate — Load a URL in the canvas
- canvas.eval — Execute JavaScript in the canvas
- canvas.snapshot — Take a screenshot of the canvas
Permissions
Node Mode features require macOS permissions that are requested through standard TCC prompts on first use. You can manage all permissions from Settings > Permissions, which shows the current status of each permission with options to request or open System Settings.
DashScope (Qwen)
DashScope is Alibaba Cloud's AI platform, providing access to the Qwen family of models. Qwen models offer strong multilingual capabilities (especially Chinese and English), vision understanding, and code generation.
Setup
Install via pip: pip install dashscope
Sign up at the DashScope Console and create an API key.
Go to Settings → CLIs → DashScope and enter your API key. McClaw stores it securely in the macOS Keychain.
Available Models
| Model | Best For |
|---|---|
| Qwen-Max | Complex reasoning, long documents, code generation |
| Qwen-Plus | Balanced performance and cost |
| Qwen-Turbo | Fast responses, simple tasks |
| Qwen-VL | Vision — image understanding and analysis |
Configuration
DashScope settings are accessible from Settings → CLIs → DashScope:
- API Key — Your DashScope API key (stored in Keychain)
- Default Model — Select from Qwen-Max, Qwen-Plus, Qwen-Turbo
- Temperature — Controls response randomness
- Max Tokens — Maximum response length
Copilot CLI
GitHub Copilot CLI provides AI-powered code assistance through the GitHub CLI ecosystem. It excels at code explanation, shell command suggestions, and developer-focused workflows.
Setup
brew install gh
gh extension install github/gh-copilot
Log in with your GitHub account (requires an active Copilot subscription):
gh auth login
Requirements
- Active GitHub Copilot subscription (Individual, Business, or Enterprise)
- GitHub CLI (gh) installed and authenticated
Codex
Codex is OpenAI's autonomous coding agent that runs in the terminal. It can write, refactor, and debug code using OpenAI models.
Setup
npm install -g @openai/codex
Set your OpenAI API key:
# OpenAI
export OPENAI_API_KEY=sk-xxxxxxxx
# Anthropic
export ANTHROPIC_API_KEY=sk-ant-xxxxxxxx
# Or use Ollama for local models (no API key needed)
codex --model ollama/llama3
How McClaw Uses Codex
McClaw integrates Codex as a CLI provider, routing messages through the Codex CLI. This means you get Codex's multi-file editing capabilities with McClaw's native UI, voice mode, connectors, and automation on top.
Configuration
Codex settings are accessible from Settings → CLIs → Codex:
- LLM Backend — Select OpenAI, Anthropic, or Ollama
- Model — Choose the specific model (e.g., gpt-4, claude-3, llama3)
- Working Directory — Default directory for code editing sessions
BitNet Experimental
BitNet is an open-source inference framework by Microsoft for 1-bit Large Language Models (1.58-bit LLMs). Unlike traditional models that use 16/32-bit weights, BitNet models restrict all weights to ternary values {-1, 0, +1}, enabling dramatically more efficient inference on CPUs.
How BitNet Differs from Ollama
Both BitNet and Ollama run models locally, but they serve different purposes:
- Ollama runs standard quantized models (4-bit, 8-bit). Large ecosystem of models, higher quality output, more resource-intensive.
- BitNet runs natively ternary models (1.58-bit). Dramatically lower memory and energy consumption, fewer models available, faster on CPU.
System Requirements
BitNet has heavier installation requirements than other providers:
| Requirement | Minimum Version |
|---|---|
| Python | 3.9+ |
| Conda | Any |
| CMake | 3.22+ |
| Clang | 18+ (install via brew install llvm@18) |
| Disk space | ~5 GB (framework + models) |
Works on both Apple Silicon (ARM) and Intel (x86) Macs. McClaw detects your architecture and uses the optimal quantization kernels automatically.
Installation
McClaw handles the full BitNet installation through a guided, multi-phase process accessible from Settings → CLIs → BitNet:
McClaw verifies Python, Conda, CMake, Clang, and Git are installed. If Clang 18+ is missing, it offers to install it via Homebrew.
Clones the BitNet repository to ~/.mcclaw/bitnet, creates a dedicated Conda environment (mcclaw-bitnet), and installs Python dependencies.
Downloads the default model (BitNet-b1.58-2B-4T by Microsoft, ~500 MB) from Hugging Face.
Compiles optimized inference kernels for the downloaded model. The build is model-specific, so each new model requires a rebuild. The full process takes approximately 10–15 minutes.
Available Models
McClaw includes a built-in registry of BitNet-compatible models. You can download and manage them from Settings → CLIs → BitNet → Manage Models:
| Model | Parameters | Size | Notes |
|---|---|---|---|
| BitNet 2B (Official) | 2.4B | ~500 MB | Default. Microsoft's official 1-bit LLM. |
| BitNet Large | 0.7B | ~200 MB | Smaller model, fast inference. |
| BitNet 3B | 3.3B | ~700 MB | 3.3B parameter 1-bit model. |
| Llama3 8B (1-bit) | 8.0B | ~1.5 GB | Llama3 architecture with 1-bit weights. |
Configuration
BitNet settings are accessible from Settings → CLIs → BitNet:
- Default Model — Select which model to use for new conversations
- Threads — Number of CPU threads for inference (auto-detected)
- Context Size — Maximum context window in tokens (default: 2048)
- Temperature — Controls response randomness (default: 0.7)
Limitations
- No streaming support (responses appear when complete)
- No tool use or function calling
- No vision or multimodal capabilities
- No MCP server support
- Limited model ecosystem compared to Ollama
- Each new model requires a kernel rebuild (~5 minutes)
Uninstallation
To remove BitNet, go to Settings → CLIs → BitNet → Uninstall BitNet. This removes the Conda environment and the ~/.mcclaw/bitnet directory.
Security
McClaw is designed with a security-first approach. It never handles your AI provider credentials and includes multiple layers of protection for system access.
Execution Approvals
When an AI provider wants to execute a shell command, McClaw shows an approval dialog before running it. You can:
- Approve once — Allow the specific command this time only
- Always allow — Add a glob pattern to your allow list so matching commands run without prompts
- Deny — Block the command
Glob-Based Allow Lists
Configure trusted command patterns to reduce approval friction. For example:
# Allow all git commands
git *
# Allow listing files in home directory
ls ~/Documents/*
# Allow Python scripts in a specific project
python /path/to/project/*.py
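These patterns behave like shell globs matched against the full command line. A minimal sketch of the matching logic, using Python's fnmatch — the function and list names are illustrative, not McClaw's internals:

```python
# Illustrative allow-list check; not McClaw's actual implementation.
from fnmatch import fnmatch

ALLOW_LIST = [
    "git *",                         # all git commands
    "ls ~/Documents/*",              # listing files in home directory
    "python /path/to/project/*.py",  # Python scripts in a specific project
]

def is_allowed(command: str) -> bool:
    """Return True if the command matches any allow-list pattern."""
    return any(fnmatch(command, pattern) for pattern in ALLOW_LIST)

print(is_allowed("git status"))  # True
print(is_allowed("rm -rf /"))    # False
```

A command that matches no pattern would still trigger the approval dialog described above.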
Environment Sanitization
Before passing commands to CLI processes, McClaw automatically strips sensitive environment variables (API keys, tokens, secrets) to prevent accidental exposure. The sanitization is configured via HostEnvSanitizer and covers common patterns like *_KEY, *_SECRET, *_TOKEN, and provider-specific variables.
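The effect of this filtering can be sketched as follows — the pattern list comes from the docs above, but the function itself is hypothetical, not HostEnvSanitizer's actual code:

```python
# Hypothetical sketch of environment sanitization; the real HostEnvSanitizer
# pattern set and behavior may differ.
from fnmatch import fnmatch

SENSITIVE_PATTERNS = ["*_KEY", "*_SECRET", "*_TOKEN"]

def sanitize(env: dict[str, str]) -> dict[str, str]:
    """Drop any variable whose name matches a sensitive pattern."""
    return {
        name: value
        for name, value in env.items()
        if not any(fnmatch(name, p) for p in SENSITIVE_PATTERNS)
    }

env = {"PATH": "/usr/bin", "OPENAI_API_KEY": "sk-...", "GITHUB_TOKEN": "ghp-..."}
print(sanitize(env))  # {'PATH': '/usr/bin'}
```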
Keychain Storage
All connector tokens, OAuth credentials, and other secrets are stored in the macOS Keychain using the Security framework. Secrets are never written to plain text files on disk.
TCC Compliance
McClaw follows Apple's Transparency, Consent, and Control framework. Access to sensitive resources requires explicit user permission through standard macOS prompts:
- Microphone — Required for voice mode
- Camera — Required for Node Mode camera capture
- Screen Recording — Required for Node Mode screen capture
- Location — Required for Node Mode location services
Secure IPC
Remote connections between McClaw instances use Unix sockets with HMAC authentication and length-prefixed binary frames. The HMAC key is derived from a shared secret and includes a UID for session binding.
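To make the framing concrete, here is a minimal sketch of length-prefixed frames with an HMAC tag. The exact wire layout, digest choice, and UID binding in McClaw's protocol are not specified here, so treat this as an illustration of the technique, not the real format:

```python
# Illustrative length-prefixed frame with HMAC-SHA256 authentication.
# Field order and digest are assumptions; McClaw's wire format may differ.
import hashlib, hmac, struct

def encode_frame(payload: bytes, key: bytes) -> bytes:
    """4-byte big-endian length, then payload, then HMAC tag."""
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return struct.pack(">I", len(payload)) + payload + tag

def decode_frame(frame: bytes, key: bytes) -> bytes:
    """Verify the tag in constant time before trusting the payload."""
    (length,) = struct.unpack(">I", frame[:4])
    payload, tag = frame[4:4 + length], frame[4 + length:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("HMAC verification failed")
    return payload

key = b"shared-secret"
print(decode_frame(encode_frame(b"hello", key), key))  # b'hello'
```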
Remote Access
McClaw can be accessed remotely, allowing you to use it from another machine or network.
Connection Modes
- SSH Tunnel — McClaw creates an SSH tunnel (ssh -N -L) to forward a local port to the remote McClaw instance. This is the recommended mode for secure remote access over untrusted networks.
- Direct Connection — Connect directly to a McClaw instance on your local network. Suitable for trusted environments (e.g. home network).
SSH Tunnel Setup
Configure your SSH target (user, host, port) and the local port to forward.
Use the "Test Connection" button to verify SSH connectivity before activating the tunnel.
Enable the tunnel. The RemoteTunnelManager handles the lifecycle, including automatic reconnection if the tunnel drops.
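Under the hood, the tunnel amounts to spawning ssh -N -L with the configured target. The helper below is a hypothetical sketch of the command shape, not RemoteTunnelManager's actual invocation:

```python
# Hypothetical sketch of building the ssh port-forward command.
import subprocess

def tunnel_command(user: str, host: str, ssh_port: int,
                   local_port: int, remote_port: int) -> list[str]:
    return [
        "ssh", "-N",                                    # no remote shell, forwarding only
        "-p", str(ssh_port),
        "-L", f"{local_port}:localhost:{remote_port}",  # local port -> remote McClaw
        f"{user}@{host}",
    ]

cmd = tunnel_command("alice", "example.com", 22, 8765, 8765)
print(" ".join(cmd))
# subprocess.Popen(cmd)  # a tunnel manager would run and supervise this process
```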
Connection Mode Coordinator
The ConnectionModeCoordinator manages switching between local and remote modes. When remote mode is active, CLI operations are routed through the SSH tunnel, while local features continue to work normally.
Native Channels
Native Channels allow McClaw to receive messages and respond directly on external messaging platforms. Unlike Connectors (which pull data), Native Channels establish a persistent real-time connection to each platform, enabling two-way communication and automated delivery of scheduled task results.
Supported Platforms
Messaging
- Telegram — Bot API with long-polling. Responds to direct messages and group mentions. Requires a Bot Token from @BotFather.
- Slack — Socket Mode WebSocket connection. Supports DM-only mode and threaded replies. Requires Bot Token (xoxb-) and App Token (xapp-).
- Discord — Gateway WebSocket with full bot presence. Responds in channels where mentioned. Requires a Bot Token from the Discord Developer Portal.
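For a feel of what long-polling means for the Telegram channel above: the client repeatedly calls the Bot API getUpdates endpoint with a timeout, and each request blocks until a message arrives or the timeout expires. The helper below only builds the request URL (the getUpdates endpoint and its offset/timeout parameters are part of the public Telegram Bot API; the token is a placeholder):

```python
# Sketch of constructing a Telegram Bot API long-poll request URL.
import urllib.parse

def poll_url(token: str, offset: int, timeout: int = 30) -> str:
    """offset acknowledges previously seen updates; timeout enables long-polling."""
    query = urllib.parse.urlencode({"offset": offset, "timeout": timeout})
    return f"https://api.telegram.org/bot{token}/getUpdates?{query}"

print(poll_url("123:ABC", offset=0))
```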
Team & Open Source
- Matrix — Synapse-compatible via /sync long-poll. Supports any Matrix homeserver. Requires Access Token and Homeserver URL.
- Mattermost — WebSocket connection for self-hosted and cloud instances. Requires Personal Access Token and Server URL.
- Rocket.Chat — DDP (Distributed Data Protocol) real-time connection. Requires Auth Token, User ID, and Server URL.
- Zulip — Event queue with long-polling. Requires Bot Email, API Key, and Server URL.
Social & Streaming
- Mastodon — Streaming WebSocket for any Mastodon-compatible instance. Requires Access Token and Instance URL.
- Twitch — EventSub WebSocket for chat messages. Requires OAuth Token and Client ID from the Twitch Developer Console.
Setting Up a Native Channel
Browse the available platforms organized by category.
Each platform requires specific tokens or API keys. Click the platform card to expand the configuration form with platform-specific fields and instructions.
McClaw establishes a persistent connection. The channel status indicator turns green when connected. Messages received on the platform are routed to the active AI provider for response.
Channels & Scheduled Tasks
Native Channels integrate with the Schedules system. When creating a scheduled task, you can select one or more connected channels as delivery targets in Step 4 (Results & Delivery). Each channel destination can have its own recipient (chat ID, channel name, thread, etc.).
How It Works
Each Native Channel maintains a real-time connection using the platform's preferred protocol (WebSocket, long-poll, or event queue). When a message arrives:
- The channel service receives and parses the incoming message
- McClaw routes it to the configured AI provider
- The AI response is sent back through the same channel
All connections auto-reconnect on failure with exponential backoff. Channel credentials are stored securely in the macOS Keychain.
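An exponential backoff schedule doubles the wait after each failed reconnect attempt, up to a cap. A minimal sketch (the base, cap, and attempt count are illustrative, not McClaw's actual values):

```python
# Illustrative exponential backoff schedule for channel reconnects.
def backoff_delays(base: float = 1.0, cap: float = 60.0, attempts: int = 6) -> list[float]:
    """Delay doubles each attempt: base * 2^n, capped at `cap` seconds."""
    return [min(cap, base * 2 ** n) for n in range(attempts)]

print(backoff_delays())  # [1.0, 2.0, 4.0, 8.0, 16.0, 32.0]
```

Real implementations typically add random jitter to each delay so that many clients do not reconnect in lockstep.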
Troubleshooting
CLI Not Detected
Make sure the CLI is installed and available in your PATH. Open Terminal and run the CLI command directly (e.g. claude --version) to verify it works. If the CLI is installed but McClaw does not detect it, try restarting McClaw.
Streaming Not Working
McClaw uses --print --output-format stream-json for Claude CLI streaming. If responses appear all at once instead of streaming:
- Make sure your CLI is up to date
- Check that no shell aliases are interfering with the CLI command
- Try running the CLI directly in Terminal to verify streaming works
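For reference, stream-json output is line-delimited JSON: one event object per line, parsed as it arrives. The event shapes below are hypothetical placeholders — the real Claude CLI schema has more event types and fields — but the parsing pattern is the same:

```python
# Minimal parser for hypothetical line-delimited stream-json events.
import json

lines = [
    '{"type": "text", "text": "Hello"}',
    '{"type": "text", "text": ", world"}',
    '{"type": "done"}',
]

def collect_text(stream) -> str:
    """Parse each line as JSON and concatenate the text events."""
    out = []
    for line in stream:
        event = json.loads(line)
        if event.get("type") == "text":
            out.append(event["text"])
    return "".join(out)

print(collect_text(lines))  # Hello, world
```

If the CLI buffers its output instead of flushing per line, a consumer like this sees everything at once — which is exactly the "responses appear all at once" symptom described above.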
Voice Mode Not Working
- Check that microphone permission is granted in System Settings > Privacy & Security > Microphone
- Make sure no other app is exclusively using the microphone
- Try selecting a different input device in the Voice Settings
Connector Authentication Failed
- Google 400 error — Make sure you have entered a valid OAuth Client ID and Client Secret from the Google Cloud Console. See Connectors > Google Workspace Setup for step-by-step instructions.
- Microsoft error — Verify your Azure App Registration has the correct API permissions and a valid client secret.
- Revoke the connector in Settings and re-authenticate
- Check that your browser allows popups from McClaw (needed for OAuth flow)
- If token refresh fails, the connector will prompt you to re-authenticate automatically
Remote Connection Issues
- Verify your SSH credentials are correct and the remote host is reachable
- Check firewall rules if connecting to a remote machine
- Use Settings > Remote to test the connection before enabling remote mode
- Ensure the remote machine has the AI CLIs installed and accessible
Build Issues
- Ensure Xcode 16+ is installed with Swift 6.0
- Always use ./scripts/build-app.sh, not swift build alone
- Run cd McClaw && swift test to verify the codebase compiles and tests pass
- If you see Sparkle-related errors, make sure the framework is available in the build path
Config Reset
If McClaw is behaving unexpectedly, you can reset its configuration:
# Back up your config first
cp ~/.mcclaw/mcclaw.json ~/.mcclaw/mcclaw.json.bak
# Remove config to trigger fresh onboarding
rm ~/.mcclaw/mcclaw.json
McClaw will run the Onboarding Wizard again on next launch.
Getting Help
If you encounter issues not covered here:
- Search existing issues (GitHub Issues will be available when the project goes public)
- Open a new issue with your macOS version, McClaw version, and steps to reproduce
- Include relevant logs from ~/.mcclaw/ if available