Overview
GitHub Copilot autocompletes your code. Type a comment describing what you want, and it writes the function. Start a line, and it finishes the thought. It works across dozens of languages and integrates directly into VS Code, JetBrains, Neovim, and other editors. Microsoft charges $10–$39 a month, depending on the plan.
The technology is built on OpenAI’s Codex models (now GPT-4 derivatives), trained on billions of lines of public code from GitHub repositories. This is the part that gets awkward: Microsoft owns GitHub, which hosts the code, and Microsoft invested $13 billion in OpenAI, which trained on that code. The developers who wrote that code were never asked permission and receive no compensation.
A class-action lawsuit filed in 2022 challenged this. It's still winding through the courts. Meanwhile, Copilot has become the fastest-growing product in Microsoft's history.
What It Knows About You
Copilot sees your code. All of it. Every file open in your editor, your repository structure, your commit messages, your comments. For individual users, Microsoft retains code snippets and prompts to improve the model unless you opt out. Business and Enterprise plans offer stronger data protections — your code isn’t used for training.
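If you stay on an individual plan, the opt-out matters. The snippet-retention toggle lives in your GitHub account settings (under the Copilot section), not in the editor. On the editor side, a VS Code settings.json along these lines limits what leaves your machine — a sketch, assuming current VS Code and Copilot extension setting names, which do change between releases:

```jsonc
{
  // Turn off VS Code's own telemetry channel
  "telemetry.telemetryLevel": "off",

  // Disable Copilot per file type; keys are VS Code language IDs.
  // Here it stays on generally but is off for prose files, where
  // secrets and internal notes tend to live.
  "github.copilot.enable": {
    "*": true,
    "plaintext": false,
    "markdown": false
  }
}
```

Note this governs editor telemetry and where Copilot activates, not the server-side training opt-out — that one only exists in your GitHub account settings.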
The telemetry is extensive. Microsoft tracks which suggestions you accept, which you reject, how long you pause before accepting, what you edit after accepting. They know your coding patterns better than you do.
If you’re working on proprietary software on an individual plan without the right settings, you’re feeding trade secrets into Microsoft’s training pipeline. Many companies ban Copilot for exactly this reason.
The Real Risks
Intellectual property is the central concern. Copilot occasionally reproduces substantial chunks of existing code verbatim — including code with GPL, AGPL, or other copyleft licenses. If Copilot pastes GPL code into your proprietary project, you may have a legal problem. Microsoft added a filter to block exact matches, but near-matches slip through.
The threat to jobs is significant and growing. Junior developers feel it most. Companies are already reducing hiring for entry-level positions because senior developers with Copilot can handle tasks that previously required a team. The irony cuts deep: the tool trained on junior developers' code is now used to justify not hiring junior developers.
Security vulnerabilities are a real problem. Stanford researchers found that developers using AI code assistants produce less secure code and are more confident about it. Copilot suggests SQL injection vulnerabilities, hardcoded credentials, and insecure cryptographic patterns. It doesn’t know what’s dangerous — it knows what’s common, and common code is often bad code.
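The SQL injection case is worth seeing concretely. String interpolation into a query is exactly the kind of pattern that is common in training data and therefore gets suggested; the parameterized version is the fix. A minimal, self-contained sketch (hypothetical table and function names, using Python's built-in sqlite3):

```python
import sqlite3

# Toy database: two users, two secrets
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", "s1"), ("bob", "s2")])

def lookup_unsafe(name):
    # The pattern assistants often suggest: user input interpolated into SQL
    query = f"SELECT secret FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def lookup_safe(name):
    # Parameterized query: the driver treats the input as a literal value
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(lookup_unsafe(payload))  # injection: dumps every secret in the table
print(lookup_safe(payload))    # [] — the payload matches no actual name
```

Both versions run, both look plausible in an editor, and only one of them leaks the whole table. That is the trap: the model has no way to tell you which one it just suggested.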
Skill atrophy is measurable. Programmers who rely on Copilot for six months show decreased ability to solve problems without it. The tool is a cognitive crutch. Great for productivity today, potentially devastating for the profession long-term.
Alternatives
- Codeium / Cody (Sourcegraph): Free or cheaper alternatives with similar functionality and sometimes better privacy policies.
- Continue.dev: Open-source code assistant that works with local models. Your code never leaves your machine.
- TabNine: Offers a local model option that runs entirely on-device.
- Reading documentation: Slower. More reliable. Builds actual understanding.
Our Verdict
Copilot earns a C — not because it’s safe, but because the risks are manageable if you configure it correctly. Use the Business plan, disable telemetry, enable the license filter, and review every suggestion before accepting. The job displacement concern is serious but broader than one product. The IP concerns are real and unresolved. It’s a powerful tool with genuine tradeoffs. Just don’t pretend autocomplete is the same as understanding.