TypeScript's Native Rewrite in Go
TypeScript’s Native Rewrite: The Go Surprise, The Rust Reality, and What It All Means for You
If you’ve been following the TypeScript ecosystem lately, you’ve probably heard the buzz about the compiler being rewritten in a native language for massive performance gains. And if you’re like a lot of developers, you might have assumed that language was Rust. After all, Rust has been eating the JavaScript tooling world for the past few years. SWC, Biome, Oxc, Rolldown… the list goes on.
But here’s the twist: Microsoft chose Go.
That decision sparked one of the most spirited language debates in recent memory. Let’s dig into what’s actually happening, why Go won the day, where Rust fits into the bigger picture, and what all of this means for your day-to-day development workflow.
The Problem: TypeScript Hit a Wall
TypeScript’s compiler (tsc) has always been written in TypeScript itself, running on Node.js. For years, that was fine. But as codebases grew into the millions of lines, developers started feeling real pain. Compile times stretched into minutes. Editor responsiveness degraded. Memory usage ballooned, sometimes to the point of out-of-memory crashes on large monorepos.
As Anders Hejlsberg, TypeScript’s creator, put it: JavaScript was designed for UI and browser usage, not for compute-intensive workloads like compilers. The team had squeezed about as much performance out of JavaScript as they could. Something had to give.
Project Corsa: The 10x Promise
In March 2025, Microsoft announced Project Corsa, a port of the TypeScript compiler to Go. Not a ground-up rewrite, but a line-by-line port of the existing codebase. The early benchmarks were jaw-dropping. Compiling the VS Code codebase (roughly 1.5 million lines of TypeScript) dropped from about 78 seconds to around 7.5 seconds. Other large projects like Playwright and TypeORM saw similar 10x to 13x improvements.
By December 2025, the project had matured significantly. The native language service (the engine behind your editor’s autocomplete, go-to-definition, and error highlighting) was stable enough for daily use. Type-checking was nearly complete. Microsoft started shipping nightly preview builds through an npm package and a VS Code extension. Teams inside and outside Microsoft were already using it for real work.
The new compiler will ship as TypeScript 7.0, while the current JavaScript-based compiler continues as the TypeScript 6.x line during the transition period.
Wait, Why Not Rust?
This is the question that lit up every corner of the internet. Rust is fast. Rust is safe. Rust is already powering the next generation of JavaScript tooling. So why did the TypeScript team pass on it?
Hejlsberg addressed this directly. The answer comes down to pragmatism over ideology.
Porting vs. rewriting. The TypeScript compiler is roughly 100,000 lines of code with deeply established patterns. The team chose to port (translate the existing code structure) rather than rewrite from scratch. Go’s programming model maps closely to how the TypeScript compiler is already written: functions and data structures, very few classes, a functional style. Rust’s ownership model, borrow checker, and lifetime system would have required fundamentally restructuring the entire codebase. As Ryan Cavanaugh, TypeScript’s development lead, noted: Rust “succeeds wildly at its design goals,” but easy portability from a JavaScript codebase was never one of those goals.
Cyclic data structures. Compiler internals are full of cyclic references. Child nodes in an AST hold back-pointers to their parents, and parents hold references to their children, so the object graph is riddled with cycles (a minimal sketch follows below). Go handles this naturally with garbage collection. Rust’s ownership model effectively outlaws cyclic data structures without resorting to reference counting or unsafe code, which would have added significant complexity.
Garbage collection. The existing TypeScript compiler relies heavily on garbage collection for memory management. Go provides this out of the box. Moving to Rust would have meant manually managing memory throughout the entire codebase or introducing patterns that fight against Rust’s core design philosophy.
Concurrency. A big chunk of the 10x speedup comes from parallelism. Go’s goroutines made it straightforward to parallelize parsing, binding, and type-checking across multiple cores. Hejlsberg noted that roughly half the performance gain came from native code execution, and the other half came from concurrent processing.
Team velocity. A full Rust rewrite could have taken years with uncertain compatibility. The Go port delivered a working, highly compatible compiler in under a year.
In short: Go was the fastest path to a 10x improvement with minimal risk of breaking the ecosystem. It wasn’t about Go being “better” than Rust in absolute terms. It was about Go being the right tool for this specific job.
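To make the cyclic-reference point concrete, here is a deliberately simplified TypeScript sketch of the shape a compiler's syntax tree takes. These are not the compiler's actual types, just an illustration of the parent/child back-pointer pattern:

```typescript
// Simplified sketch of an AST node, not TypeScript's real ts.Node types.
// The parent back-pointers are what make the structure cyclic: every child
// references its parent, and the parent references its children.
interface AstNode {
  kind: string;
  parent?: AstNode;     // back-pointer up the tree
  children: AstNode[];  // references back down
}

function addChild(parent: AstNode, kind: string): AstNode {
  const child: AstNode = { kind, parent, children: [] };
  parent.children.push(child); // cycle: parent -> child -> parent
  return child;
}

const sourceFile: AstNode = { kind: "SourceFile", children: [] };
const fn = addChild(sourceFile, "FunctionDeclaration");
addChild(fn, "Identifier");

// In a garbage-collected runtime (JavaScript today, Go in the port), dropping
// the last outside reference to this graph is enough; the collector reclaims
// the cycles. In Rust, the same shape pushes you toward Rc/RefCell, arenas,
// or unsafe code.
```

TypeScript's own public compiler API exposes the same shape, every ts.Node carries a parent reference, so the port inherits this pattern at the scale of millions of nodes.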
The WASM Concern: Where Rust Would Have Shined
Not everyone was convinced. Evan You, the creator of Vue.js and the force behind VoidZero’s Rust-based Oxc toolchain, raised a legitimate concern: Go’s WebAssembly performance is notoriously poor.
This matters because there are real use cases for running the TypeScript compiler in the browser. Online playgrounds, browser-based IDEs like StackBlitz, documentation sites with interactive code examples. These all depend on compiling TypeScript client-side. Early tests showed that a basic type check using the Go-based compiler compiled to WASM took over a second, potentially slower than simply running the current JavaScript-based tsc.
This is an area where Rust would have had a clear advantage. Rust produces small, efficient WASM binaries with near-native performance. Go’s WASM output carries a roughly 2MB runtime overhead and has not received the same level of optimization in browser contexts.
Microsoft hasn’t fully addressed this gap yet. The likely outcome is that browser-based tools will continue using the JavaScript-based TypeScript compiler (or one of the Rust-based alternatives) for the foreseeable future, while the Go-based tsgo handles CLI builds, CI pipelines, and editor integration where it runs natively.
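For a sense of what "compiling TypeScript client-side" looks like in practice, here is a minimal sketch using the existing JavaScript-based compiler's public API. transpileModule is a single-file transform; a playground that also wants full diagnostics layers a program with an in-memory compiler host on top of this:

```typescript
// Minimal sketch: client-side compilation with the current JavaScript-based
// compiler, roughly as a playground or browser IDE might do it today.
import * as ts from "typescript";

const source = `
  const greet = (name: string): string => \`Hello, \${name}!\`;
  console.log(greet("world"));
`;

// Single-file transform: strips types and lowers syntax, no project-wide check.
const result = ts.transpileModule(source, {
  compilerOptions: {
    target: ts.ScriptTarget.ES2020,
    module: ts.ModuleKind.ESNext,
  },
});

console.log(result.outputText); // plain JavaScript, ready to run in the page
```

The WASM concern is about replacing this path with a Go binary compiled to WebAssembly, where the runtime overhead currently erases much of the native speed advantage.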
Rust’s Real Role: Owning the Tooling Layer
Here’s the thing: even though Rust didn’t win the compiler rewrite, it’s arguably winning the broader war for JavaScript and TypeScript tooling. The ecosystem of Rust-based tools has exploded, and these tools are already transforming daily development workflows.
SWC (Speedy Web Compiler) was the pioneer. Built by Donny (kdy1), it’s a Rust-based replacement for Babel that handles transpilation at native speeds. It’s used under the hood by Next.js, Parcel, and Deno. SWC proved that Rust could deliver order-of-magnitude improvements for JavaScript tooling.
Oxc (The JavaScript Oxidation Compiler) is the next evolution, built under Evan You’s VoidZero umbrella. It’s a modular collection of Rust-based tools sharing a single high-performance parser. The benchmarks are staggering: the parser is 3x faster than SWC and 5x faster than Biome. The transformer is 40x faster than Babel. Oxlint runs 50 to 100x faster than ESLint. And the recently announced Oxfmt formatter is 30x faster than Prettier and 3x faster than Biome.
Biome emerged from the Rome project as a unified linter and formatter written in Rust. It offers sensible defaults and a single configuration file, simplifying what used to require coordinating ESLint, Prettier, and a handful of plugins.
Rolldown is a Rust-based bundler powered by Oxc, designed as a drop-in replacement for Rollup. It’s on track to become the default bundler for Vite, which would bring Rust-powered bundling to one of the most popular frontend build tools.
Rspack and Turbopack are Rust-based alternatives to Webpack, both powered by SWC. They’re targeting the large segment of the ecosystem that still depends on Webpack’s plugin architecture but desperately needs faster builds.
Together, these tools are replacing nearly every layer of the traditional JavaScript toolchain: transpilation, bundling, linting, formatting, and module resolution. The common thread is Rust.
What This Means for Your Workflow
So what should you actually expect as a working developer? Here’s the practical breakdown.
Faster everything in your editor. TypeScript 7’s native language service means your autocomplete, error checking, hover tooltips, and refactoring tools will respond dramatically faster. Large monorepos that used to choke your editor should become responsive. This alone might be the most impactful change for day-to-day productivity.
Dramatically shorter CI times. If your CI pipeline spends minutes on type-checking, expect that to drop to seconds. For teams running type checks on every pull request across large codebases, this translates directly into cost savings and faster iteration cycles.
A transition period, not a switch. TypeScript 6 (JavaScript-based) and TypeScript 7 (Go-based) will coexist. The Go-based compiler doesn’t yet support the full API of the current JavaScript implementation (code-named Strada) that many third-party tools depend on. Linters, formatters, and IDE extensions that rely on TypeScript’s compiler API will need to adapt to the new Corsa API. This will take time.
Strict mode becomes the default. TypeScript 7 enables strict mode by default. If your project has been running with relaxed settings, you’ll need to either tighten up your types or explicitly opt out. This is a breaking change that will surface new errors in some codebases; see the sketch at the end of this section for the kinds of errors to expect.
Browser tooling stays JavaScript and Rust for now. If you maintain a TypeScript playground, browser-based IDE, or any tool that runs tsc client-side, you’ll likely stick with the JavaScript-based compiler or adopt a Rust-based alternative like Oxc for parsing and analysis.
The Rust toolchain is ready to adopt today. You don’t need to wait for TypeScript 7 to start benefiting from native tooling. Oxlint can supplement or replace ESLint right now. Biome can handle both linting and formatting. If you’re on Vite, Rolldown integration is coming. SWC is already baked into Next.js. These tools are production-ready and offer immediate performance gains.
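To make the strict-mode point concrete, here is a small sketch of the two checks that most often bite when the default flips: noImplicitAny and strictNullChecks. The code below is already written the strict-friendly way; the comments note what the loose version allowed.

```typescript
// Errors that commonly surface when strict mode turns on, with their fixes.

// Under loose settings, an unannotated parameter silently becomes 'any'.
// With strict (noImplicitAny), it must be typed explicitly:
function getName(user: { name: string }): string {
  return user.name;
}

// Under loose settings, a possibly-undefined value can be dereferenced freely.
// With strict (strictNullChecks), Array.prototype.find forces you to handle
// the miss, since its return type here is number | undefined:
const ids = [1, 2, 3];
const match = ids.find((id) => id > 10);
console.log(match?.toFixed(2) ?? "no match");
console.log(getName({ name: "Ada" }));
```

Opting out remains a one-line tsconfig change ("strict": false), but tightening the types is usually the better long-term investment.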
The Bigger Picture
What we’re witnessing is a fundamental shift in how JavaScript and TypeScript tooling is built. The days of writing performance-critical developer tools in JavaScript are ending. The question isn’t whether native tooling will replace JavaScript-based tooling; that replacement is already underway.
The interesting nuance is that this native revolution isn’t monolithic. It’s split between Go and Rust, each carving out different niches. Go won the TypeScript compiler because it was the pragmatic choice for porting a large, existing codebase with cyclic data structures and garbage collection needs. Rust is winning everything else because its performance characteristics, memory safety, and excellent WASM support make it ideal for building new tools from scratch.
For developers, this is overwhelmingly good news. Faster builds. Faster editors. Faster CI. Less waiting. More building. The tooling is getting out of your way, which is exactly what good tooling should do.
The TypeScript compiler rewrite wasn’t a Rust story. But the future of TypeScript tooling absolutely is. And both of those facts can be true at the same time.
Jeff Kershner
Engineering Leader & Lifelong Coder | Co-Founded AI Startup Deployed in 1,200+ Retail Locations