
Optimize for Deletion

By Tzvi Melamed

""

I've been thinking a lot lately about a principle I picked up from Chris Sauve during my time at Shopify: optimizing for deletion. 🤔

The idea is simple, but it’s changed the way I write and structure code—especially as I’ve moved from large companies to smaller, faster-paced teams. 🚀

When I first heard the phrase, I was deep in a Rails codebase filled with all the usual suspects: helpers, views, global state, hard-to-follow dependencies. Nothing could be deleted without it feeling like pulling a thread from a sweater. I hated it. I wanted clean abstractions. Reuse. DRY everything up.

So I left. I went to Google. ✈️🗺️🏢

Google had everything I wanted: abstractions, testing frameworks, internal tools that did the things I'd dreamed of pitching. But the code still had the same problems—just in a different flavor. We were still organizing by type instead of feature. Cobwebs of dead code 🕸️ still lurked in all the corners of our codebase.

Now I’m working at a smaller company, Blazesoft, where I have more say in technical decisions. What I find myself reaching for—more than DRY, more than reuse—is this idea of fractality. I want every feature to be shaped like the whole system. Self-contained. Disposable. 🗑

Why Deletion Matters

Optimizing for deletion isn’t really about deleting, per se. It’s a proxy for something deeper: ease of movement. If code is easy to delete, it’s also easy to move, duplicate, or reshape. And those are the things that make codebases flexible and sustainable. 💪

When code is isolated by feature and contained within a single folder, it becomes easier to copy a working version to try out a new direction. You can duplicate a component, change the logic, and delete it later if it doesn’t work out. You don’t have to worry about untangling imports 🧶, breaking shared utilities 💔, or touching five other files to adjust a single behavior 🐇🕳️.
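
For example, trying a new direction can be as small as copying a feature folder (say, a hypothetical features/products to features/products-v2) and pointing the app at the copy:

// App.tsx: the only application code that changes during the experiment
import { ProductList } from "./features/products-v2/components/ProductList";
// was: import { ProductList } from "./features/products/components/ProductList";

If the experiment doesn't pan out, delete features/products-v2/ and revert that one import.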

A Real-World Example

Let’s look at how this can unfold incrementally in a real-world monorepo. 🧑‍💻

Suppose you start with two apps, each with many features:

apps/
  admin/
    src/
      components/
        ProductList.js
        ProductItem.js
        ...
      reducers/
        products.js
        cart.js
        users.js
        ...
      actions/
        productActions.js
        cartActions.js
        userActions.js
        ...
      hooks/
        useProductData.js
        useCartState.js
        useUserData.js
        ...

  storefront/
    src/
      components/
        ProductGrid.js
        ProductTile.js
        ...
      reducers/
        catalog.js
        filters.js
        ...
      actions/
        catalogActions.js
        filterActions.js
        ...
      hooks/
        useCatalogData.js
        useFilterState.js
        ...

Step one: extract a single feature in admin into its own feature-scoped folder.

apps/
  admin/
    src/
      features/
        products/
          components/
            ProductList.js
            ProductItem.js
          state/
            reducer.js
            actions.js
            useProductData.js
      ...
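
As a rough sketch (the data shapes below are assumptions, going off the Redux-style file names in the tree, not code from a real app), the co-located state module might look like this:

// features/products/state/reducer.ts: state owned entirely by the feature
export interface Product {
  id: string;
  title: string;
}

export interface ProductsState {
  items: Product[];
}

// actions.ts would export action creators like this one
export const productsLoaded = (items: Product[]) =>
  ({ type: "products/loaded", items }) as const;

type ProductsAction = ReturnType<typeof productsLoaded>;

export function productsReducer(
  state: ProductsState = { items: [] },
  action: ProductsAction
): ProductsState {
  switch (action.type) {
    case "products/loaded":
      return { ...state, items: action.items };
    default:
      return state;
  }
}

Nothing outside features/products/ needs to know these shapes; the components and the useProductData hook sit right next to them.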

Next, extract a second feature:

apps/
  admin/
    src/
      features/
        products/
          ...
        cart/
          components/
            CartView.js
            CartItem.js
          state/
            reducer.js
            actions.js
            useCartState.js
      ...
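
The payoff shows up in the wiring. Assuming a Redux-style root store (the example doesn't say, so the paths here are invented), the root reducer becomes the only place outside the feature folders that mentions them:

// store/rootReducer.ts: hypothetical wiring, not from the original codebase
import { combineReducers } from "redux";
import { productsReducer } from "../features/products/state/reducer";
import { cartReducer } from "../features/cart/state/reducer";

export const rootReducer = combineReducers({
  products: productsReducer,
  // Removing the cart feature means deleting features/cart/ and this one line.
  cart: cartReducer,
});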

Then suppose both features use a shared utility, utilify.ts. Instead of putting it in a global utils folder, move it into a shared, feature-scoped module:

apps/
  admin/
    src/
      features/
        shared/
          state/
            utilify.ts

Features import from features/shared/state/utilify.ts. This keeps dependencies visible and local, and it encourages cleanup once the utility is no longer used. 🧼😎
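
For illustration, the call site might look like this (formatMoney is an invented helper; only utilify.ts comes from the example above):

// features/shared/state/utilify.ts: hypothetical contents
export const formatMoney = (cents: number) => `$${(cents / 100).toFixed(2)}`;

// features/cart/state/actions.ts: the dependency is visible right in the relative path
import { formatMoney } from "../../shared/state/utilify";

export const cartTotalUpdated = (totalCents: number) =>
  ({ type: "cart/totalUpdated", displayTotal: formatMoney(totalCents) }) as const;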

Fast forward ⏩: you’ve done this for several features in both admin and storefront. Each app has grown more structured and consistent. At this point, moving to a shared package is obvious and safe: 📈

packages/
  shared/
    features/
      product/
        components/
          ProductDisplay.js
        state/
          reducer.js
          actions.js
          selectors.js
          useProductData.js

      shared/
        state/
          utilify.ts

Now both apps consume from the shared package 💏🏼, but only after the structure has proven useful locally 👌🏼.
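
The exact wiring depends on your tooling, but with npm, pnpm, or Yarn workspaces the consuming side might look roughly like this (the package name @acme/shared is invented, and the import paths assume the package exposes its feature folders):

// In apps/admin and apps/storefront alike
import { ProductDisplay } from "@acme/shared/features/product/components/ProductDisplay";
import { productsReducer } from "@acme/shared/features/product/state/reducer";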

Why It Works Everywhere

This structure lets the organization evolve without fighting the codebase. It encourages ownership and autonomy. And it makes deletion just another tool in your toolbox, instead of a last resort. 🧳🪄

Even though this example is frontend-focused, the underlying principle works just as well across architectural paradigms. Whether you're building backend services in an MVC framework, implementing clean architecture, orchestrating microservices, or writing hardware drivers, the same principle applies: organize your code so it’s safe and easy to delete.
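
As a sketch of the same shape on a backend (Express is my example here, not something from the article), a feature can own its routes, handlers, and data access in one folder, with a single registration line elsewhere:

// features/invoices/routes.ts: everything the invoices feature owns lives in this folder
import { Router } from "express";

export const invoicesRouter = Router();

invoicesRouter.get("/invoices", (_req, res) => {
  // handlers, queries, and validation for this feature stay local
  res.json({ invoices: [] });
});

// app.ts would hold the only outside reference:
//   app.use(invoicesRouter);
// Delete features/invoices/ and that mount, and the feature is gone.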

It’s a design principle that scales across stacks and domains.

You Can Start Small

The best part? You don’t have to start over. We didn’t. We applied this incrementally—one feature at a time, one shared utility at a time. It worked.

This doesn’t come for free. It means duplicating things sometimes. Not abstracting prematurely. Choosing clarity over cleverness. 💡

But it pays off. I’ve seen it. 💰👀🎯

And more importantly—I’ve felt it. That sense of lightness when you can delete something with confidence and move forward. 🎆

A Final Word

If I were to give one piece of advice to someone early in their career, it would be this: try applying the idea of optimizing for deletion. It's deceptively simple, but it creates room for experimentation and growth. When your code is easy to throw away, it's easier to learn. You're free to try something, realize it doesn't work, and replace it without friction. You're not stuck with early decisions. Your system evolves with you. 🪴

And while optimizing for deletion isn't a formal architectural pattern, it’s one of those habits that quietly supports many others. It pairs beautifully with ideas like separating interface from implementation—patterns like Adapter, Factory, and Mediator. You’ll see it echoed in modern protocols like the Language Server Protocol (LSP) or Model Context Protocol (MCP). All of them aim for the same thing: loose coupling and easy change. 🔌🏗️

That’s why I think optimizing for deletion is so powerful. It's non-prescriptive. It doesn’t fight your existing architecture. And it doesn’t need you to buy into a grand philosophy. 🧘

Just try it. Use it to create space for learning, iteration, and better teamwork. You don’t have to adopt it all at once. Start with one feature. One utility. One folder. See how it feels. 👣

And when you delete something easily, smile. That’s the point. 😄🗑️✨