Sourcegraph Cody is an AI-powered coding assistant developed by Sourcegraph, the company behind the widely used code intelligence platform. Cody is designed to help developers understand, write, and fix code faster by leveraging a deep understanding of your entire codebase, not just open files. It integrates directly into popular Integrated Development Environments (IDEs) and uses Sourcegraph's code graph and search capabilities to provide highly context-aware assistance.
Cody acts as an intelligent partner that can generate code, answer questions about your code, automate common tasks, explain complex logic, and help with debugging, all while respecting your code's privacy and security. It supports a wide array of programming languages and offers flexibility in choosing underlying Large Language Models (LLMs).
Sourcegraph Cody offers a comprehensive suite of AI-driven tools for developers:
- Deep Codebase Context: Cody's primary differentiator is its ability to use Sourcegraph's code graph to fetch context from your entire codebase (local files, entire repositories, or even multiple repositories when connected to a Sourcegraph instance). This enables more accurate and relevant AI assistance.
- AI Code Completion & Generation ("Auto-edit"):
- Provides intelligent, context-aware single and multi-line code completions as you type.
- Generates code snippets, functions, or entire files from natural language prompts or comments.
- Cody Chat (Agentic Chat):
- An interactive AI chat interface within your IDE.
- Ask questions about your code (e.g., "How does this function work?", "Where is this variable used?", "What's the purpose of this class?").
- Get explanations of complex code blocks.
- Generate unit tests.
- Debug code by describing issues and getting suggestions for fixes.
- Refactor code with AI guidance.
- Generate documentation (docstrings, comments).
- The chat can autonomously gather and refine context from your codebase to provide more accurate answers.
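To make the test-generation workflow concrete, here is an illustrative example. The `slugify` function is hypothetical, and the tests show only the *kind* of output Cody's test generation might produce; actual suggestions vary by model and available context:

```python
# A function a developer might select in the editor:
def slugify(title: str) -> str:
    """Convert a title to a URL-friendly slug."""
    return "-".join(title.lower().split())

# Illustrative unit tests of the kind Cody might suggest
# when asked to generate tests for the selection above:
def test_slugify_basic():
    assert slugify("Hello World") == "hello-world"

def test_slugify_single_word():
    assert slugify("Python") == "python"

def test_slugify_extra_spaces():
    assert slugify("  Multiple   Spaces  ") == "multiple-spaces"
```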
- Cody Commands (Pre-built & Custom):
- Built-in Commands: Offers a set of predefined, reusable prompts for common tasks, such as:
- `/explain`: Explains the selected code.
- `/test`: Generates unit tests for the selected code.
- `/doc`: Generates documentation for the selected code.
- `/smell`: Identifies code smells or areas for improvement in the selected code.
- `/fix`: Suggests fixes for issues in the selected code (in-line edits).
- Custom Commands: Users can create their own reusable commands with custom prompts and context configurations (saved in `cody.json` files at the project or user level).
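As an illustration, a custom command definition in `cody.json` might look like the following. The exact schema is defined by Sourcegraph and may change over time; the field names shown here (`commands`, `prompt`, `context`) are an assumption based on the documented format, so consult the official custom commands documentation before relying on them:

```json
{
  "commands": {
    "explain-errors": {
      "description": "Explain the error handling in the selected code",
      "prompt": "Explain how the selected code handles errors and suggest improvements.",
      "context": {
        "selection": true,
        "currentFile": true
      }
    }
  }
}
```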
- In-line Edits & Fixes:
- Highlight code and use hotkeys (e.g., `Opt+K` on macOS or `Alt+K` on Windows) to ask Cody to edit, refactor, or fix it directly in your editor, with changes presented as a diff.
- Natural Language Code Search: Leverages Sourcegraph's search capabilities combined with AI to help you find code snippets, understand concepts, or locate definitions across your entire codebase using natural language queries.
- Sourcegraph Agents:
- AI-powered agents designed to automate more complex tasks and workflows across the Software Development Life Cycle (SDLC), such as rule-based code reviews, large-scale code migrations, and custom API-driven automations (primarily an enterprise-focused feature).
- IDE Integration:
- Works as an extension within popular IDEs, including:
- Visual Studio Code (VS Code)
- JetBrains IDEs (IntelliJ IDEA, PyCharm, GoLand, WebStorm, PhpStorm, Rider, CLion, DataGrip, RustRover, Aqua, DataSpell)
- Neovim
- Experimental support for Visual Studio and Eclipse.
- Broad Language & Framework Support: Works with virtually any programming language or framework, since its suggestions are grounded in the code it retrieves as context rather than in language-specific rules.
- Choice of LLMs (Plan-dependent):
- Allows selection from various powerful LLMs, including:
- Anthropic Claude series (e.g., Claude 3.5 Sonnet, Claude 3 Opus, Claude 3 Haiku)
- OpenAI GPT series (e.g., GPT-4o, GPT-4 Turbo)
- Google Gemini series (e.g., Gemini 1.5 Pro, Gemini 1.5 Flash)
- Mixtral
- Support for connecting to local Ollama models (for offline/private use).
- Enterprise plans may offer "Bring Your Own LLM Key" options.
- Security & Privacy Focus:
- Enterprise-Grade Security: For self-hosted or dedicated cloud Sourcegraph Enterprise instances, Cody offers full data isolation, zero retention of code snippets or embeddings by LLM partners, and no training on customer code.
- Sourcegraph.com (Cloud): Sourcegraph has agreements with LLM partners for zero retention and no training on Cody inputs/outputs.
- Detailed audit logs and controlled access are available for enterprise deployments.
Sourcegraph Cody is particularly effective for:
- Understanding Large & Complex Codebases: Quickly get explanations of unfamiliar code, trace dependencies, and understand how different parts of a system interact.
- Onboarding New Developers: Helps new team members get up to speed faster by allowing them to ask questions about the codebase in natural language.
- Accelerating Code Writing & Refactoring: Generate boilerplate code, complete complex functions, and refactor existing code more efficiently.
- Improving Code Quality & Consistency: Identify code smells, get suggestions for improvements, and generate unit tests to ensure code reliability.
- Generating Documentation: Automatically create docstrings and comments for functions and classes.
- Debugging: Ask Cody for help in diagnosing errors and get suggestions for fixes.
- Finding Relevant Code Examples: Use natural language search to find examples of how to use specific APIs or implement certain patterns within your own codebase.
- Automating Repetitive Tasks: Leverage Cody Agents and custom commands to automate common developer workflows.
- Large-Scale Code Changes: When used in conjunction with Sourcegraph Batch Changes, Cody's AI assistance can help plan and execute widespread refactors or updates.
Getting started with Sourcegraph Cody involves installing the extension and connecting it to Sourcegraph:
- Sign Up/Log In:
- Cody Free/Pro: Create an account on Sourcegraph.com.
- Cody Enterprise: Your organization will have a Sourcegraph Enterprise instance (self-hosted or dedicated cloud).
- Install the Cody IDE Extension:
- Open your preferred IDE (e.g., VS Code, IntelliJ).
- Go to the extension marketplace and search for "Sourcegraph Cody."
- Install the extension.
- Connect Cody to Sourcegraph:
- After installation, the Cody extension will prompt you to sign in.
- Free/Pro users: Sign in with your Sourcegraph.com account.
- Enterprise users: Connect to your organization's Sourcegraph Enterprise instance URL and authenticate.
- Using Cody Chat:
- Open the Cody chat panel/sidebar in your IDE (often accessible via a Cody icon in the activity bar).
- Type your questions about your code, ask for explanations, or request code generation.
- You can mention specific files or symbols (e.g., `@filename.go` or `@myFunction`) to provide Cody with more precise context.
- Using Code Completions (Auto-edit):
- As you type code, Cody will automatically provide inline suggestions (often grayed out).
- Press `Tab` to accept a suggestion.
- Press `Esc` to dismiss it.
- Using Cody Commands:
- Select a piece of code.
- Open the Cody command palette (often via the right-click context menu or a keyboard shortcut) and choose a pre-built command like `/explain`, `/test`, or `/doc`, or run a custom command.
- Alternatively, type commands directly into the Cody chat interface.
- Using In-line Edits & Fixes:
- Select the code you want to modify.
- Use the appropriate hotkey (e.g., `Opt+K` on macOS, `Alt+K` on Windows) or right-click and select the "Edit Code with Cody" (or similar) option.
- Describe the changes you want in natural language. Cody will generate a diff for you to review and apply.
- Configuring Cody:
- Access Cody settings within the IDE extension to manage preferences, such as default LLMs (if your plan allows choices), custom commands (`cody.json`), and the connection to your Sourcegraph instance.
Sourcegraph Cody offers several tiers:
- Cody Free:
- Cost: $0.
- Target User: Individual developers, hobbyists, light usage.
- Features: Unlimited autocomplete suggestions. Limited chat messages and prompts per month (e.g., 200). Access to several LLM choices for chat (e.g., Claude 3.5 Sonnet, Gemini 1.5 Pro/Flash, Mixtral). Ability to connect to local Ollama models. Context from your local codebase. Community support.
- Deployment: Multi-tenant Cloud (Sourcegraph.com).
- Cody Pro:
- Cost: ~$9 USD per user per month.
- Target User: Individual professional developers.
- Features: Unlimited autocomplete and chat messages/prompts. Access to more powerful LLMs for chat (all models in Free, plus options like GPT-4o). Context from your local codebase. Community support.
- Deployment: Multi-tenant Cloud (Sourcegraph.com).
- Cody Enterprise Starter:
- Cost: ~$19 USD per user per month.
- Target User: Growing organizations.
- Features: Unlimited autocomplete, chat messages, and prompts. Intent detection and integrated search results from Sourcegraph. Private workspace. Privately indexed code (limits may apply).
- Deployment: Multi-tenant Cloud.
- Cody Enterprise:
- Cost: Starting from ~$59 USD per user per month (annual commitment often required, contact sales for custom pricing).
- Target User: Larger organizations needing enterprise-level security, scalability, and flexibility.
- Features: All features of Pro/Enterprise Starter plus:
- Full Codebase Context: Deep context from your organization's entire codebase via a Sourcegraph Enterprise instance. Multi-repo context support.
- Flexible LLM Choices: Access to a wider range of LLMs, including potentially bringing your own LLM key (BYOK) for certain models.
- Advanced Security & Compliance: Enhanced security controls, policy management, full data isolation, zero retention by LLM partners.
- Deployment Options: Dedicated Cloud or Self-Hosted Sourcegraph instance.
- Sourcegraph Agents: Access to AI agents for automating complex workflows.
- IP Indemnification: Full IP indemnification under specific terms.
- Dedicated support.
Note: Specific limits, LLM availability per plan, and pricing details are subject to change. Always refer to the official Sourcegraph Cody pricing page (https://sourcegraph.com/cody/pricing or https://sourcegraph.com/pricing) for the most current information.
- Code Ownership (Enterprise): For Sourcegraph Enterprise AI tools, "as between the parties, you own all Inputs to and Outputs generated by your use of Sourcegraph. You retain ownership of your code and responsibility for ensuring any code snippets emitted by Sourcegraph comply with software licenses and copyright law."
- IP Indemnification (Enterprise): Sourcegraph offers full IP indemnification for Enterprise customers against claims that the use of AI Tools or Outputs infringes third-party IP rights, provided users adhere to terms (e.g., using current versions and provided filters).
- LLM Data Handling: Sourcegraph states that their partner LLMs (e.g., Anthropic, OpenAI) will not retain any Input or Output from the model (including embeddings) beyond the time it takes to generate the Output ("Zero Retention") for Cody's operations. Enterprise deployments offer further isolation.
- Responsibility: Users are ultimately responsible for the code they write and commit, including reviewing AI-generated suggestions for accuracy, security, and compliance with licensing.
Always consult the official Sourcegraph Terms of Service and Enterprise AI Terms for the most accurate and detailed information.
Q1: What is Sourcegraph Cody?
A1: Sourcegraph Cody is an AI coding assistant that understands your entire codebase by leveraging Sourcegraph's code intelligence platform. It provides features like context-aware code completion, AI chat, custom commands, in-line edits, and natural language code search to help developers write, understand, and fix code more efficiently.
Q2: How is Cody different from other AI coding assistants?
A2: Cody's main differentiator is its deep understanding of your entire codebase. By connecting to a Sourcegraph instance (especially for Enterprise users), Cody can fetch context from across all your repositories, not just open files, leading to more accurate and relevant AI assistance for complex projects.
Q3: What Large Language Models (LLMs) does Cody use?
A3: Cody offers flexibility in LLM choice depending on the plan. It supports models from Anthropic (e.g., Claude 3.5 Sonnet, Claude 3 Opus, Claude 3 Haiku), OpenAI (e.g., GPT-4o), Google (e.g., Gemini 1.5 Pro, Gemini 1.5 Flash), Mixtral, and allows connecting to local Ollama models. Enterprise plans may offer more choices and "Bring Your Own Key" options.
Q4: What IDEs does Sourcegraph Cody support?
A4: Cody supports VS Code, JetBrains IDEs (IntelliJ, PyCharm, GoLand, WebStorm, Rider, CLion, etc.), and Neovim, with experimental support for Visual Studio and Eclipse.
Q5: Is Sourcegraph Cody free?
A5: Yes, Sourcegraph Cody offers a Cody Free tier for individual developers with unlimited autocomplete and a generous monthly allowance for chat and commands. Paid Cody Pro and Cody Enterprise tiers provide more features, higher limits, full codebase context (Enterprise), and advanced administrative/security controls.
Q6: How does Cody handle my code privacy and security?
A6: Sourcegraph emphasizes security and privacy.
- Enterprise: Offers options for self-hosted or dedicated cloud deployments with full data isolation, zero retention by LLM partners, and no training on customer code.
- Cloud (Free/Pro): Sourcegraph has agreements with its LLM partners for zero retention of prompts/responses and no training on this data for their models.
Users should always review Sourcegraph's official security and privacy documentation.
Q7: Can Cody understand my entire private codebase?
A7: Yes, this is a core capability, especially for Cody Enterprise users connected to a Sourcegraph Enterprise instance. Sourcegraph indexes your private repositories, creating a code graph that Cody uses to retrieve highly relevant context for its AI features.
Q8: What are Cody Commands?
A8: Cody Commands are reusable, prompt-based actions for common coding tasks. Cody comes with pre-built commands like `/explain`, `/test`, and `/doc`. Users can also create custom commands tailored to their specific workflows and project needs by defining them in a `cody.json` file.
Sourcegraph places a strong emphasis on security and responsible AI practices for Cody:
- Contextual Privacy: Enterprise deployments (self-hosted or dedicated cloud) ensure that code context remains within the customer's control and is not used to train general-purpose LLMs.
- Zero Retention by LLM Partners: Sourcegraph has agreements with its LLM partners ensuring that data (prompts, code snippets, responses) sent via Cody is not retained by the LLM providers and is not used to train their public models.
- Transparency & Control: Users often have choices regarding LLM selection (plan-dependent) and can see the context Cody is using.
- IP Protection: The Enterprise plan includes IP indemnification.
- Responsible Use: Users are encouraged to review AI-generated code critically for accuracy, security, and appropriateness.