Wave is an open-source, AI-integrated terminal for macOS, Linux, and Windows. It works with any AI model: bring your own API keys for OpenAI, Claude, or Gemini, or run local models via Ollama or LM Studio. No account required.
Wave also supports durable SSH sessions that survive network interruptions and restarts, with automatic reconnection. Edit remote files with a built-in graphical editor and preview files inline without leaving the terminal.
Key Features
Wave AI - Context-aware terminal assistant that reads your terminal output, analyzes widgets, and performs file operations
Durable SSH Sessions - Remote terminal sessions survive connection interruptions, network changes, and Wave restarts with automatic reconnection
Flexible drag & drop interface to organize terminal blocks, editors, web browsers, and AI assistants
Built-in editor for editing remote files with syntax highlighting and modern editor features
Rich file preview system for remote files (markdown, images, video, PDFs, CSVs, directories)
Quick full-screen toggle for any block - expand terminals, editors, and previews for better visibility, then instantly return to multi-block view
AI chat widget with support for multiple models (OpenAI, Claude, Azure, Perplexity, Ollama)
Command Blocks for isolating and monitoring individual commands
One-click remote connections with full terminal and file system access
Secure secret storage using native system backends - store API keys and credentials locally, access them across SSH sessions
Rich customization including tab themes, terminal styles, and background images
Powerful wsh command system for managing your workspace from the CLI and sharing data between terminal sessions
Connected file management with wsh file - seamlessly copy and sync files between local and remote SSH hosts
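As a sketch of the wsh file workflow described above, remote files can be managed from any terminal without opening a separate SSH session. The exact subcommand names and remote-path syntax shown here are assumptions; check `wsh file --help` for the real interface.

```shell
# Copy a local file to a remote SSH host
# (subcommand name and remote-path form are assumptions)
wsh file cp ./deploy.env user@prod-server:/etc/app/deploy.env

# List a remote directory from the local shell
wsh file ls user@prod-server:/var/log/app
```

Because wsh rides on Wave's existing connections, these commands reuse the already-established SSH session rather than negotiating a new one per transfer.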
Wave AI
Wave AI is your context-aware terminal assistant with access to your workspace:
Terminal Context: Reads terminal output and scrollback for debugging and analysis
File Operations: Read, write, and edit files with automatic backups and user approval
CLI Integration: Use wsh ai to pipe output or attach files directly from the command line
BYOK Support: Bring your own API keys for OpenAI, Claude, Gemini, Azure, and other providers
Local Models: Run local models with Ollama, LM Studio, and other OpenAI-compatible providers
Free Beta: AI credits are included while we refine the experience
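The CLI integration above might look like the following in practice. The stdin marker and file-attachment flag are assumptions for illustration; consult `wsh ai --help` for the actual flags.

```shell
# Pipe command output straight into Wave AI for analysis
# ("-" as a stdin marker is an assumption)
make 2>&1 | wsh ai - "why is this build failing?"

# Attach a file to the conversation from the command line
# (the -f flag is an assumption)
wsh ai -f server.log "summarize the errors in this log"
```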