I Scanned My MacBook and Found 127GB of Developer Junk — Here's What It Was

I ran a full disk scan on my 512GB MacBook and found 127GB of developer junk. Xcode, Docker, node_modules, Rust, and more.

Last week I ran MegaCleaner on my development MacBook for the first time in months. It found 127GB of junk. Not “maybe junk” — actual build artifacts, stale caches, and orphaned data from tools I use every day.

My 512GB drive had 47GB free. After cleaning, I had 168GB free. That’s a quarter of my entire disk reclaimed.

Here’s the full breakdown of what was hiding on my machine.

The Scan Results

MegaCleaner scans 29 categories (21 developer tools + 8 general system categories). On my machine, almost everything had something to report. Here’s what it found, from largest to smallest.

Xcode DerivedData + Simulators — 42GB

This is always the big one. I do iOS work across several projects, and Xcode is relentless about caching.

  • DerivedData: 16GB across about 30 project folders, most of which I haven’t opened in months. Xcode never cleans these up. Ever.
  • Old simulator runtimes: 19GB. I had iOS 17.2, 17.5, and 18.0 runtimes still installed. I’m building against iOS 18.4 now. Those are dead weight.
  • Device support files: 5GB. Debug symbol caches from old iOS versions on devices I connected once.
  • Stale archives: 2GB. Old .xcarchive bundles from builds I submitted to App Store Connect months ago.

DerivedData alone is the single easiest win on any developer’s Mac. It rebuilds automatically. There’s zero risk in deleting it.
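If you want to see (or clear) it by hand, it's one directory. A minimal sketch, assuming the default DerivedData location (you can move it in Xcode's settings, in which case adjust the path):

```shell
# Default DerivedData location (assumption: you haven't changed it in Xcode)
DERIVED="$HOME/Library/Developer/Xcode/DerivedData"

# Check how much it's holding
du -sh "$DERIVED" 2>/dev/null || true

# Delete it all; Xcode recreates per-project folders on the next build
rm -rf "$DERIVED"/*
```

For the simulators, recent Xcode versions let you list installed runtimes with xcrun simctl runtime list and delete one with xcrun simctl runtime delete, and xcrun simctl delete unavailable clears devices whose runtimes are already gone.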

Docker Images and Build Cache — 31GB

Docker Desktop on macOS stores everything inside a single Docker.raw disk image, which makes it invisible to most disk analyzers. You have to actually query Docker to see what’s inside.

My breakdown:

  • Unused images: 18GB. Old base images, intermediate layers from builds I ran weeks ago, images for projects I’m no longer working on.
  • Build cache: 11GB. Docker keeps every build layer cached. Great for rebuild speed, terrible for disk space when you have dozens of Dockerfiles.
  • Stopped containers and dangling volumes: 2GB. Easy to forget about.

The thing about Docker is that docker system prune exists, but most people never run it. And even when they do, it doesn’t remove images that still have tags. You need docker system prune -a for that, and most people are (reasonably) nervous about running it.
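For the manual route, Docker's own CLI will show you where the space goes before you delete anything. A sketch of the usual sequence:

```shell
# Docker's own accounting of images, containers, volumes, and build cache
docker system df

# Conservative: remove dangling images, stopped containers, unused networks
docker system prune

# Aggressive: also remove every image without a running container.
# Anything deleted gets re-pulled or rebuilt the next time you need it.
docker system prune -a

# Build cache can be cleared on its own
docker builder prune
```

Running docker system df first is the part most people skip; it tells you whether the prune is even worth it.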

node_modules Across 23 Projects — 18GB

This one is a classic. I counted 23 separate node_modules directories on my machine. Some of them were in projects I haven’t touched in over a year.

The average node_modules directory was about 780MB. The largest was 2.1GB — a Next.js project with a heavy dependency tree including Playwright browsers.

Here’s the thing: node_modules is fully reproducible. npm install or yarn install recreates it from the lockfile. There is no reason to keep node_modules for a project you’re not actively working on today. None.

I kept the 4 belonging to active projects and deleted the other 19.
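Finding them all is a one-liner. A sketch, assuming your repos live under ~/Projects (adjust the path to wherever yours are):

```shell
# List every node_modules under ~/Projects with its size, largest first.
# -prune stops find from descending into each node_modules it finds.
find "$HOME/Projects" -type d -name node_modules -prune -print0 2>/dev/null \
  | xargs -0 du -sh 2>/dev/null | sort -rh

# For a project you're done with, the directory is safe to delete;
# npm install (or yarn/pnpm install) rebuilds it from the lockfile:
# rm -rf "$HOME/Projects/old-project/node_modules"
```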

Rust target/ Directories — 14GB

Rust compile artifacts are large. This is well-known in the Rust community — a single project’s target/ directory can hit 5-10GB easily, especially in debug mode.

I had 6 Rust projects. Two of them I’m actively developing. The other four had target/ folders sitting there doing nothing, totaling about 10GB. Even for the active projects, the target/debug directories had artifacts from dependencies I’d since removed.

cargo clean exists but you have to run it per-project. And you have to remember.
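Since cargo clean takes a --manifest-path, one loop can at least show you what each project is holding. A sketch, again assuming projects under ~/Projects:

```shell
# Report the size of each target/ directory under ~/Projects (assumption:
# that's where your Rust projects live). Workspace member crates without
# their own target/ are silently skipped.
find "$HOME/Projects" -maxdepth 3 -name Cargo.toml 2>/dev/null | while read -r manifest; do
  du -sh "$(dirname "$manifest")/target" 2>/dev/null || true
  # Uncomment to actually clean:
  # cargo clean --manifest-path "$manifest"
done
```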

Python Conda Environments — 9GB

I keep conda around for ML experiments. The problem is that conda environments are self-contained — each one bundles its own copy of Python plus every dependency. That’s great for isolation, terrible for disk space.

I had 5 conda environments. Two were from experiments I ran once in late 2025 and never looked at again. Each environment was about 2GB. The pip cache added another 2GB on top.
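Conda at least makes the inventory easy. A sketch of the manual cleanup (the environment name below is an example, not one of mine):

```shell
# List environments with their on-disk locations
conda env list

# Remove one you no longer need (name is a placeholder)
conda env remove --name old-experiment

# Clear conda's package and tarball caches, and pip's download cache
conda clean --all
pip cache purge
```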

Homebrew Cache — 4GB

Every time Homebrew installs or upgrades a package, it downloads the source/bottle and caches it. These caches stick around indefinitely. brew cleanup is supposed to help, but I hadn’t run it in 4 months.

4GB isn’t dramatic, but it’s free money — there’s no reason to keep old Homebrew downloads.
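Homebrew's cleanup is built in; the only trick is remembering it exists. A sketch:

```shell
# Dry run: show what brew would delete without touching anything
brew cleanup -n

# Delete stale versions and cached downloads; --prune=all removes every
# cached download regardless of age
brew cleanup --prune=all
```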

IDE Caches (VS Code, JetBrains) — 3GB

I use VS Code daily and occasionally fire up a JetBrains IDE for Java work. Both cache aggressively.

  • VS Code extensions, workspace storage, cached data: 1.8GB
  • JetBrains system caches and local history: 1.2GB

Nothing critical in any of it. IDE caches rebuild seamlessly.

Browser Caches — 2GB

Chrome and Arc combined. Not developer-specific, but it all counts. These regenerate automatically as you browse.

System Caches and Logs — 4GB

macOS system logs, crash reports, temporary files, expired caches under ~/Library/Caches. The kind of stuff that accumulates silently on every Mac, developer or not.

The Surprises

A few things caught me off guard.

Docker was bigger than I expected. I knew I had some images lying around, but 31GB? I had mentally budgeted for maybe 10-15GB. The build cache was the hidden killer — I didn’t realize how much Docker was hoarding in layer cache.

Rust’s target/ directories were surprisingly large for so few projects. Six projects, 14GB. That’s over 2GB average per project. Debug builds in Rust are aggressively unoptimized (intentionally — for faster compile times), which means the artifacts are huge.

Xcode was not a surprise. Anyone who’s done iOS development for more than a year knows DerivedData is a storage black hole. The simulators were the real problem — 19GB for runtimes I’ll never use again.

node_modules was exactly what I expected — which is its own kind of sad. We’ve all accepted that JavaScript projects casually consume a gigabyte of disk space each. It doesn’t have to be that way, at least not for stale projects whose node_modules you could simply delete.

What I Cleaned vs. What I Kept

This is where MegaCleaner’s confidence levels were useful. Every item it finds is tagged with a confidence level:

  • Definite — 100% safe to delete. Caches, build artifacts, derived data. Regenerates automatically on next use.
  • Probable — Very likely safe. Old archives, stale environments, unused images. Check if you’re unsure.
  • Possible — Worth reviewing first. Things like large unknown files or old Xcode versions that might still be needed.

I cleaned everything marked “definite” without hesitation — that was about 95GB. For “probable” items, I skimmed the list and cleaned most of them (another 26GB). The few “possible” items (6GB) I reviewed individually and cleaned about half.

Total cleaned: ~121GB out of the 127GB found.

What I kept:

  • DerivedData for 2 active Xcode projects (rebuild takes 8 minutes each)
  • node_modules for 4 active projects
  • target/ for 2 active Rust projects
  • My current conda environment for an ongoing ML project

Everything else went to Trash. (MegaCleaner moves to Trash, not permanent delete — so you can always undo if something feels wrong.)

The Before and After

              Before    After
Used space    465 GB    344 GB
Free space     47 GB    168 GB
Cleaned       121 GB

I went from “I need to buy a new laptop” to “I’m fine for another year.”

A 512GB drive with 47GB free is stress. You’re getting macOS warnings, Docker complains, Xcode builds fail because there’s no room for intermediate files. At 168GB free, everything just works again.

Why I Built MegaCleaner

I used to do this manually. Every few months, when my disk got uncomfortably full, I’d open Terminal and start running commands:

rm -rf ~/Library/Developer/Xcode/DerivedData/*
docker system prune -a
find ~ -maxdepth 5 -type d -name "node_modules"
# ...and so on, for 20 minutes

The problem isn’t that it’s hard. Each command is simple. The problem is that it’s tedious, you have to remember all the paths, and you inevitably forget something. I’d clean up 40GB, feel good, and then three months later realize Docker build cache had quietly grown to 15GB again.

So I automated it. First as a script for myself, then as a proper macOS app with a UI, safety checks, and the ability to see everything in one scan.

That became MegaCleaner.

Try It Yourself

The scan is free. You can download MegaCleaner, run it, and see exactly how much space your dev tools are wasting — no purchase required.

If you want to actually clean, it’s $49 one-time. Not a subscription. You buy it once, it’s yours.

Given that upgrading a MacBook from 512GB to 1TB storage costs $200 at Apple, reclaiming 100+ GB for $49 is a decent deal.

Download MegaCleaner — free scan, see what your Mac is hiding.


I’m Sergey — I build developer tools at Morco Labs. MegaCleaner is a native macOS app, built in Swift, no Electron, no subscriptions. If you have questions or feedback, find me on Twitter/X.