Accelerating modern engineering teams
Supercharge your team’s development workflows with AI, security, and collaboration built in
Trusted by over 16,000 engineering teams at leading companies
56%
Fortune 500 engineering teams rely on Warp
187k
Engineering hours saved each month
70%
Faster onboarding of engineers
“Warp has been a game-changer for my team. We’re focused on integrating AI not just for efficiency, but to enhance how our engineers collaborate. Agent Mode and Warp Drive save each engineer hours per week.”
Engineering Lead at Public Fintech Co
Built for modern enterprise

Enterprise-grade SLAs, ensuring reliability when it matters most.
SAML-based SSO, role-based access control for permission management, and SOC 2 Type 2 compliance.
Your Data, Your Control
Industry-leading data policies, including Zero Data Retention: we will not retain or use any of your data. Secret Redaction and Secret Regexes prevent sensitive data from being leaked. You can also bring your own LLM, and Agent Mode supports configurable autonomous execution.
Advanced Insights & Performance Analytics
Real-time analytics and customizable reporting, plus command usage metrics, audit logs, and compliance tools for governance.
Premium Support, Anytime You Need It
Get premium support with a dedicated account manager, personalized onboarding, and 24/7 assistance to quickly resolve any issues.
“Coming from using the standard terminal in macOS, Ubuntu, etc. I am just blown away how much efficiency my team has gotten from Warp in just a few days. I can't stop raving about it!”
VP of Engineering at Fortune 100 Tech Co
The AI tool of choice for engineering teams

Multi-thread your entire team
AI-first developers reclaim over an hour a day, gaining 7-8 hours back each week.
Unlimited AI, for everyone
Each team member gets unlimited AI and over 100M tokens per month, so you never run into limits and your engineers stay unblocked.
Compatible and integrated
Warp is platform-agnostic and available across Windows, macOS, Linux, and Web. Warp suggests commands, switches, and arguments for over 400 CLI tools and supports over 50 languages.
Context aware across your team
Agent Mode and Warp’s AI have context across your team’s codebase and commands, making them hyper-personalized for your development.
Powerful collaboration capabilities
Save and share commands and workflows across your team with Warp Drive. With unlimited Notes, Session Sharing, Block Sharing, and Workflows, Warp Drive is the home base for your engineering team.
Access leading models
Warp brings leading models from OpenAI, Anthropic, Gemini, and Fireworks (DeepSeek Model Provider) directly to your command line.
Frequently asked questions
Zero Data Retention is available only to our Enterprise plan customers. If you select Zero Data Retention, Warp will implement our Zero Data Retention Standard, meaning that:
- Warp will not retain or otherwise use for product improvements any text, commands, code, outputs, prompts, or other content submitted, transmitted, or generated by you or any of your authorized users through Warp’s terminal application or AI features.
- Warp will not retain any of your terminal content or AI inputs/outputs (including conversational data from use of our AI features).
- Warp will not persistently store, retain, or analyze any of the data you provide through your use of AI features in Warp beyond the immediate execution of the AI requests.
Warp may, however, collect and retain high-level feature utilization data, including analytics and crash reports, subject to the terms of the Enterprise Plan agreement. Please contact your Warp sales representative to discuss whether you’d like high-level reporting for your account.
Warp integrates with multiple Large Language Model (LLM) providers to power its AI-driven features. These providers include, but are not limited to:
- Anthropic
- OpenAI
- Fireworks (DeepSeek Model Provider)
Warp has executed Zero Data Retention agreements with such LLM providers. This means that if you select Zero Data Retention:
- The LLM providers commit not to train their models on any customer-generated data processed through Warp’s services.
- The LLM providers commit to delete inputs and outputs following the generation of the relevant output.
- Warp will implement technical and contractual safeguards with LLM providers to execute this commitment.
- Agent Mode: Agent Mode is an interactive, chat-based mode in Warp that lets you perform any terminal task with natural language. Agent Mode is explicitly invoked by the user, and the chat, along with any supplied context (such as files, the user query, or attached command output), is proxied via Warp’s server to the selected LLM provider.
- Active AI: Active AI sends terminal errors, inputs, and outputs to Warp AI to proactively suggest fixes and next actions. Sensitive information is redacted client-side before any terminal content is sent (see Secret Redaction below). Active AI can be disabled via an organizational policy.
Secret Redaction is automatically applied to any content sent to AI features to prevent sensitive data from being leaked. On the Enterprise plan, Secret Regexes can be set for the entire team.
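To make the Secret Regexes idea concrete, here is a minimal sketch of client-side redaction in Python. The patterns and the redact helper are illustrative assumptions, not Warp’s built-in rules or its configuration format.

```python
import re

# Hypothetical team-defined patterns for common credential shapes.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),      # AWS access key ID
    re.compile(r"ghp_[A-Za-z0-9]{36}"),   # GitHub personal access token
    re.compile(r"sk-[A-Za-z0-9]{20,}"),   # generic "sk-" style API key
]

def redact(text: str) -> str:
    """Replace anything matching a secret pattern before it leaves the client."""
    for pattern in SECRET_PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text

print(redact("export AWS_ACCESS_KEY_ID=AKIAABCDEFGHIJKLMNOP"))
# -> export AWS_ACCESS_KEY_ID=[REDACTED]
```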
Agent Mode may request access to code files in response to user queries. Agent Mode can be configured to always ask for permission before accessing files, or to access files automatically, depending on your organizational policies.
All AI requests and responses are currently proxied through Warp’s servers (GCP, US-based), so that prompts and responses can be correctly parsed and constructed.
Warp supports LLMs hosted by your company; please check with your Warp contact to confirm that your hosted model is compatible with the Agent Mode API (Anthropic- and OpenAI-compatible APIs are currently supported). Note that, at the moment, AI requests are still proxied through Warp’s server. Zero Data Retention guarantees are automatically enabled when you provide your own LLM endpoint.
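For reference, “OpenAI-compatible” here means an endpoint that accepts standard Chat Completions requests. The sketch below shows what such a request looks like against a self-hosted model; the base URL, model name, and API key are placeholders, and this is only an illustration of the API shape, not how Warp itself calls your endpoint.

```python
import requests

# Placeholder values for a self-hosted, OpenAI-compatible endpoint.
BASE_URL = "https://llm.internal.example.com/v1"
MODEL = "my-company-model"

response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": "Bearer YOUR_INTERNAL_API_KEY"},
    json={
        "model": MODEL,
        "messages": [
            {"role": "user", "content": "Explain what `git rebase --onto` does."}
        ],
    },
    timeout=30,
)
print(response.json()["choices"][0]["message"]["content"])
```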